PAPER PRESENTATION ON
“HAPTICS” -- a technology that adds the sense of touch to virtual environments. Haptic interfaces allow the user to feel as well as see virtual objects on a computer, creating the illusion of touching surfaces, shaping virtual clay, or moving objects around. The sensation of touch is the brain’s most effective learning mechanism -- more effective than seeing or hearing -- which is why the new technology holds so much promise as a teaching tool.

Haptic technology is like exploring the virtual world with a stick: if you push the stick into a virtual balloon, the balloon pushes back. The computer communicates sensations through a haptic interface -- a stick, scalpel, racket, or pen connected to force-exerting motors. With this technology we can now sit down at a computer terminal and touch objects that exist only in the "mind" of the computer. By using special input/output devices (joysticks, data gloves, or other devices), users can receive feedback from computer applications in the form of felt sensations in the hand or other parts of the body. In combination with a visual display, haptic technology can be used to train people for tasks requiring hand-eye coordination, such as surgery and spacecraft maneuvers.

In this paper we explain how sensors and actuators are used to track the position and movement of the haptic device moved by the operator. We describe the different types of force-rendering algorithms, then move on to a few applications of haptic technology. Finally, we conclude by mentioning a few future developments.
1. What is Haptics?
Haptics refers to sensing and manipulation through touch. The word comes from the Greek ‘haptesthai’, meaning ‘to touch’. The history of the haptic interface dates back to the 1950s, when a master-slave system was proposed by Goertz (1952). Haptic interfaces grew out of the field of teleoperation, which was then employed in the remote manipulation of radioactive materials. The ultimate goal of the teleoperation system was "transparency": a user interacting with the master device in a master-slave pair should not be able to distinguish between using the master controller and manipulating the actual tool itself. Early haptic interface systems were therefore developed purely for telerobotic applications.

2. Working of Haptic Devices
Architecture for haptic feedback:

[Figure: basic architecture for a virtual reality application incorporating visual, auditory, and haptic feedback.]

• Simulation engine: responsible for computing the virtual environment’s behavior over time.
• Visual, auditory, and haptic rendering algorithms: compute the virtual environment’s graphic, sound, and force responses toward the user.
• Transducers: convert visual, audio, and force signals from the computer into a form the operator can perceive.
• Rendering: the process by which desired sensory stimuli are imposed on the user to convey information about a virtual haptic object.
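The components above can be sketched as a simple sense-simulate-actuate loop. This is a minimal illustration of the architecture, not a real driver API: `HapticDevice`, `SimulationEngine`, and their methods are hypothetical stand-ins for an actual transducer layer and physics engine.

```python
class HapticDevice:
    """Transducer layer: converts between digital signals and physical motion/force."""
    def __init__(self):
        self.applied_forces = []

    def read_position(self):
        # In a real device, position would come from sensors (encoders) that
        # track the stick, pen, or other end effector held by the operator.
        return (0.0, 0.0, 0.0)

    def apply_force(self, force):
        # In a real device, force-exerting motors would push back on the
        # user's hand; here we just record what was commanded.
        self.applied_forces.append(force)


class SimulationEngine:
    """Computes the virtual environment's behavior over time."""
    def update(self, probe_position, dt):
        # Advance the virtual world by dt and return the reaction force the
        # haptic rendering algorithm computes for this probe position.
        return (0.0, 0.0, 0.0)


def haptic_loop(device, engine, steps, dt=0.001):
    """The bidirectional haptic channel: each iteration both senses the user's
    motion and sends a force back. Real systems run this at roughly 1 kHz."""
    for _ in range(steps):
        pos = device.read_position()    # user -> computer (sensing)
        force = engine.update(pos, dt)  # simulation + haptic rendering
        device.apply_force(force)       # computer -> user (actuation)
    return len(device.applied_forces)
```

Note the contrast with the graphics and audio paths: those would only appear on the output side of this loop, while the haptic device appears on both sides.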
The human operator typically holds or wears the haptic interface device and perceives audiovisual feedback from audio displays (computer speakers, headphones, and so on) and visual displays (a computer screen or head-mounted display, for example). The audio and visual channels feature unidirectional information and energy flow, from the simulation engine toward the user, whereas the haptic modality exchanges information and energy in both directions, from and toward the user. This bidirectionality is often referred to as the single most important feature of the haptic interaction modality.

System architecture for haptic rendering:
An avatar is the virtual representation of the haptic interface through which the user physically interacts with the virtual environment....
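The balloon example from the introduction can be made concrete with the simplest force-rendering scheme for an avatar: a penalty (spring) model. This is an illustrative sketch under assumed parameters, not the specific algorithm of any particular system; the function name and the sphere-shaped object are hypothetical choices for the example.

```python
import math

def render_contact_force(avatar_pos, center, radius, stiffness):
    """Penalty-based force rendering: when the avatar (the virtual
    representation of the haptic interface) penetrates a sphere, push back
    along the outward surface normal with a Hooke's-law spring force,
    F = stiffness * penetration_depth."""
    offset = [a - c for a, c in zip(avatar_pos, center)]
    dist = math.sqrt(sum(d * d for d in offset))
    depth = radius - dist
    if depth <= 0.0 or dist == 0.0:
        return (0.0, 0.0, 0.0)           # avatar outside the object: no force
    normal = [d / dist for d in offset]  # outward surface normal
    return tuple(stiffness * depth * n for n in normal)
```

For example, pressing the avatar 1 cm into an object with stiffness 500 N/m would yield a 5 N force along the outward normal; run at the haptic loop's high update rate, such contact feels solid rather than spongy.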