Blue Eyes

• Introduction
• Emotion mouse
• Emotion and computing
• Theory
• Result
• Manual and gaze input cascaded (MAGIC) pointing
• Eye tracker
• Implementing MAGIC pointing
• Artificial intelligent speech recognition
• Application
• The simple user interface tracker
• Conclusion

Introduction: Imagine a world where humans interact naturally with computers. You are sitting in front of a personal computer that can listen, talk, or even scream aloud. It can gather information about you and interact with you through techniques such as facial recognition and speech recognition, and it can even sense your emotions at the touch of the mouse. It verifies your identity, senses your presence, and starts interacting with you. You ask the computer to dial your friend at his office; it gauges the urgency of the request through the mouse, dials your friend, and establishes a connection. The BLUE EYES technology aims at creating computational machines that have perceptual and sensory abilities like those of human beings. By employing modern video cameras and microphones as its senses, the machine can identify the user's actions, understand what the user wants, see where he is looking, and even recognize his physical or emotional state.

Emotion mouse: One goal of human-computer interaction (HCI) is to build an adaptive, smart computer system. Such a project could include gesture recognition, facial recognition, eye tracking, speech recognition, and similar techniques. Another non-invasive way to obtain information about a person is through touch. People use their computers to obtain, store, and manipulate data, and to create smart computers the machine must first gain information about its user. Our proposed method for gaining user information through touch is via a computer input device, the mouse. From the physiological data obtained from the user, an emotional state may be determined, which can then be related to the task the user is currently doing on the computer. Over a period of time, a user model is built up in order to gain a sense of the user's personality. The scope of the project is to have the computer adapt to the user in order to create a better working environment where the user is more productive. The first steps towards realizing this goal are described here.
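
The kind of mapping this implies can be illustrated with a minimal sketch, assuming mouse-mounted sensors that report pulse, fingertip temperature, and galvanic skin response, and a nearest-baseline classifier. The signal names, baseline values, and emotion labels below are hypothetical illustrations, not the project's actual design.

```python
# Hypothetical sketch: inferring an emotional state from physiological
# readings captured by sensors on a mouse. Values and labels are illustrative.
from dataclasses import dataclass
from math import sqrt


@dataclass
class Reading:
    pulse_bpm: float          # heart rate in beats per minute
    skin_temp_c: float        # fingertip temperature in degrees Celsius
    gsr_microsiemens: float   # galvanic skin response (sweat-gland activity)


# Per-emotion baseline centroids; a real system would learn these from a
# calibration session for each user rather than hard-code them.
BASELINES = {
    "calm":       Reading(68.0, 33.5, 2.0),
    "frustrated": Reading(82.0, 32.0, 6.5),
    "excited":    Reading(90.0, 34.0, 5.0),
}


def _distance(a: Reading, b: Reading) -> float:
    """Euclidean distance between two readings (features left unscaled for brevity)."""
    return sqrt(
        (a.pulse_bpm - b.pulse_bpm) ** 2
        + (a.skin_temp_c - b.skin_temp_c) ** 2
        + (a.gsr_microsiemens - b.gsr_microsiemens) ** 2
    )


def classify_emotion(sample: Reading) -> str:
    """Return the baseline emotion whose centroid is closest to the sample."""
    return min(BASELINES, key=lambda label: _distance(sample, BASELINES[label]))


if __name__ == "__main__":
    current = Reading(pulse_bpm=85.0, skin_temp_c=32.3, gsr_microsiemens=6.1)
    print(classify_emotion(current))  # -> "frustrated"
```

The classified state would then be stored alongside the task the user was performing, which is what allows the user model described above to accumulate over time.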

Emotion and computing: Rosalind Picard (1997) describes why emotions are important to the computing community. There are two aspects of affective computing: giving the computer the ability to detect emotions and giving the computer the ability to express emotions. Not only are emotions crucial for rational decision making, but emotion detection is also an important step towards an adaptive computer system. The goal of an adaptive, smart computer system has been driving our efforts to detect a person's emotional state. By matching a person's emotional state with the context in which the emotion is expressed, the person's personality is revealed over time. Therefore, by giving the computer a longitudinal understanding of its user's emotional state, the computer could adopt a working style that fits the user's personality. The result of this collaboration could be increased productivity for the user. One way of gaining information from a user non-intrusively is by video: cameras have been used to detect a person's emotional state. We have instead explored gaining information through touch, and one obvious place to put sensors is on the mouse.
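
As a rough sketch of what a "longitudinal understanding" could look like, the snippet below accumulates (task context, detected emotion) observations and reports the dominant emotion per context. The context labels and the `UserModel` API are assumptions made for illustration, not the original system's interface.

```python
# Hypothetical sketch of a longitudinal user model: log each detected emotion
# together with the task context, then report the dominant emotion per context.
from collections import Counter, defaultdict


class UserModel:
    def __init__(self) -> None:
        # context -> Counter of emotions observed while the user worked in that context
        self._history: dict[str, Counter] = defaultdict(Counter)

    def record(self, context: str, emotion: str) -> None:
        """Log one detected emotional state observed during a task."""
        self._history[context][emotion] += 1

    def dominant_emotion(self, context: str) -> str | None:
        """Return the most frequently observed emotion for a task context, if any."""
        counts = self._history.get(context)
        if not counts:
            return None
        return counts.most_common(1)[0][0]


if __name__ == "__main__":
    model = UserModel()
    for emotion in ["frustrated", "frustrated", "calm"]:
        model.record("email", emotion)
    print(model.dominant_emotion("email"))  # -> "frustrated"
```

A profile like this is what would let the computer adjust its working style, for example by simplifying its interface in contexts where the user is repeatedly frustrated.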

Theory: Based on Paul Ekman's facial expression work, we see a correlation between a person's emotional state and a person's physiological measurements. Selected works from Ekman and others on measuring facial behaviors describe Ekman's Facial Action Coding System (Ekman and Rosenberg, 1997). One of his experiments involved participants attached to devices that recorded certain measurements, including pulse,...
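
The correlation this section refers to can be illustrated with a minimal sketch: computing a Pearson correlation coefficient between one physiological measurement (pulse) and a self-reported arousal rating. The data values below are made up for illustration and do not come from Ekman's experiments.

```python
# Hypothetical sketch: Pearson correlation between a physiological measurement
# and a self-reported emotion rating across participants. Data is illustrative.
def pearson(xs: list[float], ys: list[float]) -> float:
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)


if __name__ == "__main__":
    pulse_bpm = [66.0, 72.0, 80.0, 88.0, 95.0]   # one value per participant
    arousal = [1.0, 2.0, 3.0, 4.0, 5.0]          # self-reported 1-5 rating
    print(round(pearson(pulse_bpm, arousal), 3))  # close to 1.0: strong positive correlation
```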