ACOUSTIC VISION –
Acoustic Perception Based On Real Time Video Acquisition for Navigation Assistance

Supreeth K Rao, Arpitha Prasad B, Anushree R Shetty, Chinmai, Rajeshwari Hegde (Guide and Faculty)
Department of Telecommunication Engineering
BMS College of Engineering, Bangalore, India
supreethkrao@gmail.com, arpithaprasad@gmail.com, anushree.shetty12@gmail.com, cpchinmai@gmail.com

Abstract—
A smart navigation system based on an object detection mechanism has been designed to detect, by means of real-time video processing, the presence of obstacles that immediately impede the user's path. The paper is written with the navigation of the visually impaired in mind.
A video camera feeds images of the surroundings to a Da Vinci Digital Media Processor, DM642, which works on the video frame by frame. The processor carries out image processing techniques whose result contains information about the objects in terms of image pixels. The algorithm selects, among all objects, the one that poses the maximum threat to navigation. A database containing a total of three sounds is constructed. Each image thus translates to a beep, and every beep informs the navigator about the obstacles directly in front of him. This paper implements a more efficient algorithm compared to its predecessor, NAVI.
Keywords— Navigation, Edge Detection, Flood Function, Object Detection, DM642, Acoustic Transformation

I. INTRODUCTION
Assistance for the blind or visually impaired ranges from simple measures, such as a white cane or a guide dog, to very sophisticated computer technology (enhanced imaging, synthetic speech, optical character recognition, etc.). Many of those who are visually impaired can maintain their current employment or be trained for new work with the help of such aids.
This paper deals with a vision substitution system based on an image-to-sound conversion concept. This finds particular application in the navigation of the visually impaired, and also in autonomous intelligent rovers. The output of the system can either be fed as an actuation signal to a smart control system or converted to an audio signal delivered to the blind person's earphones. The paper aims at creating a portable system that allows visually impaired individuals to travel through familiar and unfamiliar environments without the assistance of guides.

II. PROPOSED ALGORITHM
A vision acquisition device such as a video camera captures information about the system's surroundings. Image frames are procured from the video and subjected to a series of image processing techniques. Figure 1 shows the block diagram giving an overview of the algorithm used for the system implementation. The algorithm assigns significant weight to the location and size of the objects in the image under consideration; a flood function has been designed to calculate both.
These parameters are then used to assign priorities to the objects based on proximity and size. The object that has gained the highest priority is selected and, depending on its proximity and size, an acoustic transformation is performed, resulting in one of two warning tones (or silence when no object poses a threat). This sound conveys the intended information about the surroundings and helps the user achieve collision-free navigation.
Object recognition has been implemented to inform the user about what obstacles are present before him, rather than just their presence.

Fig. 1: Proposed Algorithm (block diagram): Video Acquisition → Image Extraction → Resizing [32x32] → RGB Component Extraction (Red, Green, Blue) → Edge Detection (Canny) → Combine Edges [Logical OR] → Dilation → Fill → Erosion → Flood Function → Object Preference → Acoustic Transformation → Audio Signal
III. METHODOLOGY

The proposed methodology makes use of the following standard preprocessing techniques:

A. Resizing
The captured image is resized to 32x32. This is done mainly to achieve the small computation time required for real-time processing, but it also provides the flexibility of changing the camera (and thus the resolution). This size has been selected keeping in mind the data loss incurred in resizing.

B. Edge Detection
The objects present in an image are recognized by their boundaries, and edge detection is the technique that extracts the edges of the objects. There are various edge detection operators: Sobel, Canny, Prewitt and Laplacian, to name a few; the algorithm works with both Canny and Sobel. The acquired image is a colour image consisting of red, green and blue (RGB) components [1]. Colour image segmentation offers greater accuracy than edge detection on grayscale images [2]. Hence, the RGB colour components of the resized image are extracted and edge detection is performed on each. The results are combined by performing a logical OR on the three edge-detected images to obtain a single binary image with clearly defined objects whose outlines are in white.

C. Dilation
Morphological processing involves operations that process images based on shapes; a structuring element is applied to the input image, altering it suitably. Dilation and erosion are the two morphological operations that the algorithm uses.
Dilation is used to connect broken edges in the edge-detected image. A 2x3 structuring element of ones is chosen to dilate every white pixel in the binary image. This thickens the edges and thereby connects minor breaks.

D. Fill
As the size of each object has to be calculated, the area within each object must be obtained, i.e., the number of pixels constituting the object. At this stage the image consists of objects outlined in white. The fill function, applied to this image, fills the black area within each object with white pixels. The object size is therefore the number of white pixels the object contains.

E. Erosion
Dilation adds pixels to the boundaries of objects in an image, while erosion removes pixels on object boundaries. Erosion sharpens the dilated objects and also removes unwanted white specks, which would otherwise be treated as small objects in their own right. The structuring element used here is a disk of radius one.
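For illustration, the preprocessing chain of Fig. 1 (resize, per-channel Canny edge detection, logical OR, dilation, fill and erosion) can be sketched in Python with OpenCV and SciPy as follows. This is a minimal sketch, not the DM642 implementation: the structuring elements follow the text (a 2x3 block of ones for dilation, a disk of radius one for erosion), while the Canny thresholds are illustrative assumptions.

import cv2
import numpy as np
from scipy.ndimage import binary_fill_holes

def preprocess(frame_bgr):
    """Resize, edge-detect each colour channel, then dilate, fill and erode.

    Returns a 32x32 binary image in which filled object regions are 1.
    The Canny thresholds (50, 150) are assumptions, not values from the paper.
    """
    small = cv2.resize(frame_bgr, (32, 32), interpolation=cv2.INTER_AREA)

    # Canny edge detection on each colour component, combined by a logical OR.
    edges = np.zeros((32, 32), dtype=np.uint8)
    for channel in cv2.split(small):
        edges |= cv2.Canny(channel, 50, 150)

    # A 2x3 structuring element of ones connects minor breaks in the edges.
    dilated = cv2.dilate(edges, np.ones((2, 3), dtype=np.uint8))

    # Fill the interior of each outlined object so its area can be counted.
    filled = binary_fill_holes(dilated > 0).astype(np.uint8)

    # A disk of radius one sharpens objects and removes isolated specks.
    disk = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3))
    return cv2.erode(filled, disk)

The binary map returned here is what the proximal-area masking and the flood function of the following subsections operate on.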

F. Proximal Area
The location of an object is an important aspect of collision-free navigation: the presence of objects in certain parts of the image poses a greater danger to the navigator than their presence in other parts. To understand this, let us divide the image into four parts (left, right, centre and back), as shown in Fig. 2 below.
Fig. 2: Analysis of the image frame (regions: Left, Right, Centre, Back)

Consider the following cases in Fig. 3, with the assumption that the user is walking straight ahead:

Case (a): An object is in the left (L) portion of the image; it is apparent that this object does not obstruct the navigator.
Case (b): The object is in the right (R) portion of the image. Again, it does not obstruct the person's path.
Case (c): Here, the object is in the back portion of the image. The user can walk a short distance before encountering it.
Case (d): Here, the object is in the central portion of the image and causes an immediate obstruction to the navigator.

Fig. 3: Selection of Proximal Area

Therefore, this central region of the image should be given higher preference than the other regions. To classify objects based on their location, the frame has been divided into three regions (Fig. 4), two of which constitute the high-priority region: the Proximal Area, consisting of A1 and A2 (Fig. 2). Within the Proximal Area, objects present in A1 are assigned the highest priority, while those present in A2 are assigned a lower priority.

Fig. 4: Proximal Area (32x32 frame; A2: 16x16; A1: 8x16)

This paper assigns preferences to A1 and A2 through the use of masks M1 and M2 consisting of ones. M1 is a mask of dimension 24x16 highlighting the objects in areas A1 and A2; M2 is a mask of dimension 8x16 highlighting the objects in A1 only. Using these masks, the following operations are performed:
M1 = Image & M1; (objects in A1 and A2 are highlighted)
M2 = Image & M2; (objects in A1 alone are highlighted)
Image = Image + M1 + M2;
This results in an image consisting of ones, twos and threes in the regions outside the proximal area, in A2 and in A1, respectively.
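As a minimal sketch of this masking step, assuming the 32x32 binary image produced by the preprocessing above, the labelling can be written as follows. The exact placement of the proximal area (bottom centre of the frame, with A1 as its lowest 8x16 block and A2 as the 16x16 band above it) is an assumption made for illustration; the paper gives only the mask dimensions.

import numpy as np

def label_proximal(binary_img):
    """Label pixels by region: 1 outside the proximal area, 2 in A2, 3 in A1.

    binary_img is the 32x32 array of 0/1 values from preprocessing.
    The placement of the masks within the frame is an illustrative assumption.
    """
    img = binary_img.astype(np.uint8)

    m1 = np.zeros_like(img)      # M1: 24x16 mask covering A1 and A2
    m1[8:32, 8:24] = 1
    m2 = np.zeros_like(img)      # M2: 8x16 mask covering A1 only
    m2[24:32, 8:24] = 1

    # M1 = Image & M1;  M2 = Image & M2;  Image = Image + M1 + M2
    m1 &= img
    m2 &= img
    return img + m1 + m2         # pixel values: 0, 1 (outside), 2 (A2), 3 (A1)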

G. Flood Function
The objects in the image now consist of ones, twos and threes; hence, calculating an object's size amounts to counting its ones, twos and threes. In addition, the concentration of each object (its number of pixels) in A1 and A2 has to be calculated: counting the twos gives the object's concentration in A2, and counting the threes gives its concentration in A1.
To calculate the size of the objects present in the frame, and their concentrations in A1 and A2, a flood function is designed.

The image is scanned for a one, two or three, and the flood function is called whenever one of these pixel intensities is encountered. The function called at a pixel spreads to its neighbours if their pixel intensities are non-zero. The function is defined for a connectivity of eight, i.e., all eight neighbours of each pixel are checked for a one, two or three. In effect, when the flood function is called at a pixel of an object, it spreads to cover the entire object until it reaches the object's boundaries. During flooding, a pixel count is incremented (giving the object size) and the numbers of twos and threes are counted (giving the object's concentration in A2 and A1, respectively). To prevent the function from flooding into already explored pixels, the intensities of pixels where the function has been called are set to zero.
Therefore, the function not only counts the required values but also shrinks the object out of existence. This has no adverse consequences, as all the necessary information has already been gathered by then.
The scanning continues, and the procedure is repeated as and when other objects are encountered.
A database containing object sizes and concentrations is thus created.
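The flood function described above can be sketched as an explicit-stack, 8-connected fill over the labelled image (a stack is used instead of recursion purely for robustness; this is an illustrative sketch, not the DM642 code):

import numpy as np

def flood(img, row, col):
    """8-connected flood from (row, col), zeroing out the object as it goes.

    Returns (size, in_a2, in_a1): the object's total pixel count and its
    number of twos (pixels in A2) and threes (pixels in A1).
    """
    stack = [(row, col)]
    size = in_a2 = in_a1 = 0
    while stack:
        r, c = stack.pop()
        if r < 0 or r >= img.shape[0] or c < 0 or c >= img.shape[1]:
            continue
        value = img[r, c]
        if value == 0:
            continue                       # background or already explored
        size += 1
        in_a2 += int(value == 2)
        in_a1 += int(value == 3)
        img[r, c] = 0                      # mark the pixel as explored
        for dr in (-1, 0, 1):              # spread to all eight neighbours
            for dc in (-1, 0, 1):
                if dr or dc:
                    stack.append((r + dr, c + dc))
    return size, in_a2, in_a1

def find_objects(labelled):
    """Scan the labelled image and build the database of object parameters."""
    img = labelled.copy()
    objects = []
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            if img[r, c]:
                objects.append(flood(img, r, c))
    return objects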
H. Priority Assignment

Objects falling in the Proximal Area should be given high priority; in addition, importance should be given to object size. Consider the following conflicting cases:
Case 1: If two objects lie in A2, the larger object should be given the higher priority.
Case 2: If there is a large object in A2 and a smaller object in A1, then, because the user encounters the object in A1 first, objects in A1 are given higher priority than those in A2, regardless of the difference in size.
Case 3: If a small percentage of a huge object lies within A1 and A2, and an object of smaller size lies only within A2, then the object whose concentration within A1 or A2 is greater gains the higher priority.
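One way to encode this ordering is a lexicographic priority key in which concentration in A1 dominates, then concentration in A2, then overall size. This weighting is an assumption made for illustration; the paper states the ordering of the cases but not an explicit formula.

def priority(obj):
    """Priority key for one (size, in_a2, in_a1) tuple from find_objects().

    Any presence in A1 outranks an object confined to A2 (Case 2); among
    objects in the same region the larger concentration wins (Cases 1 and 3);
    overall size breaks remaining ties. This lexicographic ordering is an
    illustrative assumption, not the paper's exact rule.
    """
    size, in_a2, in_a1 = obj
    return (in_a1, in_a2, size)

def most_threatening(objects):
    """Return the highest-priority object, or None if the frame poses no threat."""
    candidates = [o for o in objects if o[1] or o[2]]   # inside the proximal area
    return max(candidates, key=priority) if candidates else None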

I. Acoustic Transformation
By this stage every object has gained a level of priority. Among all objects, the one that has gained the highest priority must have its presence conveyed to the blind user. The algorithm therefore translates this object into an audio signal, resulting in a beep. All priority levels are categorized into three sounds: (i) a sound of high pitch, indicating the presence of an object posing the greatest threat and implying that the user must immediately change his direction; (ii) a sound of medium pitch, indicating that an object would soon pose the greatest threat if the user continues on his path; and (iii) no sound, which informs the user that no object yet poses a threat.
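As an illustration of this acoustic transformation, the sketch below maps the selected object to one of two sine-wave beeps, or to silence, for playback through the earphones. The pitch values, beep duration and sample rate are assumptions; the paper specifies only "high pitch", "medium pitch" and "no sound".

import numpy as np

SAMPLE_RATE = 8000              # Hz; assumed audio output rate

def beep(frequency_hz, duration_s=0.25):
    """Synthesise a 16-bit PCM sine beep of the given pitch."""
    t = np.arange(int(SAMPLE_RATE * duration_s)) / SAMPLE_RATE
    return (0.5 * np.sin(2 * np.pi * frequency_hz * t) * 32767).astype(np.int16)

def acoustic_transform(selected):
    """Map the highest-priority object to one of three acoustic outputs.

    High pitch   -> the object reaches into A1 (immediate threat).
    Medium pitch -> the object lies in A2 only (threat if the user continues).
    Silence      -> no object inside the proximal area.
    The 880 Hz / 440 Hz pitches are illustrative assumptions.
    """
    if selected is None:
        return np.zeros(0, dtype=np.int16)      # (iii) no sound
    size, in_a2, in_a1 = selected
    return beep(880.0) if in_a1 else beep(440.0)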

IV. HARDWARE IMPLEMENTATION

The system uses the Digital Media Processor TMS320DM642 (Version 3), which belongs to the Da Vinci family of Texas Instruments' C6000 series. The DM642 Evaluation Module (EVM) is a low-cost, high-performance video and imaging development platform designed to jump-start application development and evaluation of multi-channel, multi-format digital media and other future-proof applications. The DM642 has been specially designed for real-time video and audio processing, with dedicated video encoders and a decoder. Leveraging the high performance of the TMS320C64x DSP core, this development platform supports TI's TMS320DM642, DM641 and DM640 digital media processors. The TMS320C64x DSPs (including the TMS320DM642 device) are the highest-performance fixed-point DSP generation in the TMS320C6000 DSP platform. The TMS320DM642 device is based on the second-generation high-performance, advanced VelociTI very-long-instruction-word (VLIW) architecture (VelociTI.2) developed by Texas Instruments (TI), making these DSPs an excellent choice for digital media applications. The DM642 offers a speed of 720 MHz, 4 MB of Flash, 32 MB of 133 MHz SDRAM and a 256 kbit I2C EEPROM [3]. The JTAG emulator used to communicate with the processor is the XDS510USB Plus. A PAL (Phase Alternating Line) camera is used to acquire the input to the system (at 30 frames per second), and a set of 3.5 mm jack earphones provides the output of the system to the user.


Fig. 5: Hardware Implementation using DM642
V. RESULTS

This section shows the results for a sample image.

Fig. 6: Image Extraction & Resizing

Fig. 7: Image processing

A1:         9     0     0     0     0
A2:       105     0     0     0     0
Objsize:  114     9     0     0     0
Priority: 115     0     0     0     0

Highest priority: 115

Fig. 8: End results considering a maximum of five objects

VI. CONCLUSION

Acoustic vision is a sensory substitution system (for vision) that acquires, processes, analyses and understands images from the real world, and ultimately aims to provide a synthetic vision through sound, relevant to obstacle-detection-based navigation. Image extraction from a video input is carried out; the boundaries of objects present in the image are identified and the objects' sizes are calculated. Importance is given to the size and proximity of objects through well-defined iris areas. The flood function counts the size of objects and their concentration in these iris areas. Priority assignments then take place to categorize detected objects based on the amount of threat they pose, and finally an acoustic transformation is performed to translate the visual information into a meaningful sound. This is executed by the Da Vinci Digital Media Processor, DM642, in half a second, after which the real-time system returns to the first step and the procedure is carried out repeatedly.
Neuroscience and psychology research indicate recruitment of relevant brain areas in seeing with sound, as well as functional improvement through training. However, the extent to which cortical plasticity allows for functionally relevant rewiring or remapping of the human brain is still largely unknown and is being investigated in an open collaboration with research partners around the world [4]. In effect, this paper translates one sense organ's experience so that it can be understood by another, to which the user eventually becomes accustomed owing to the neural plasticity of the human brain. Visually impaired people can thus "see" by listening to the output of this algorithm.

VII. REFERENCES

Nagarajan R, Sainarayanan G, Yacoob S, Porle R.R, "NAVI: An Improved Object Identification for NAVI", TENCON IEEE Region 10 Conference, Volume A.
[1] Soumya Dutta and Bidyut B. Chaudhuri, "A Color Edge Detection Algorithm in RGB Color Space".
[2] S Sapna Varshney, Navin Rajpa and Ravindar Purwar, "Comparative Study of Image Segmentation Techniques and Object Matching using Segmentation", International Conference on Methods and Models in Computer Science, 2009.
[3] Texas Instruments, DM64x Digital Media Developer's Kit.
[4] Paul Bach-y-Rita and Stephen W. Kercel, "Sensory substitution and the human–machine interface".
[5] TMS320DM642 Evaluation Module with TVP Video Decoders – Technical Reference, Spectrum Digital, 2004.
[6] Driver Examples on the DM642 EVM, Texas Instruments Application Report SPRA932, August 2003.
