What is Haptics?
Mandayam A. Srinivasan
Director, Laboratory for Human and Machine Haptics: The Touch Lab
Massachusetts Institute of Technology
http://touchlab.mit.edu
1. Introduction

Haptics refers to sensing and manipulation through touch. Since the early part of the twentieth century, the term haptics has been used by psychologists for studies of active touch of real objects by humans. In the late 1980s, when we started working on novel machines pertaining to touch, it became apparent that a new discipline was emerging that needed a name. Rather than coining a new term, we chose to redefine haptics by enlarging its scope to include machine touch and human-machine touch interactions. Our working definition of haptics includes all aspects of information acquisition and object manipulation through touch by humans, machines, or a combination of the two, in environments that can be real, virtual, or teleoperated. This is the sense in which substantial research and development in haptics is being pursued around the world today.

In order to organize the rapidly growing multidisciplinary research literature, it is useful to define sub-areas of haptics. Haptics can be subdivided into three areas:

1. human haptics - the study of human sensing and manipulation through touch;
2. machine haptics - the design, construction, and use of machines to replace or augment human touch;
3. computer haptics - the algorithms and software associated with generating and rendering the touch and feel of virtual objects (analogous to computer graphics).

Consequently, multiple disciplines such as biomechanics, neuroscience, psychophysics, robot design and control, mathematical modeling and simulation, and software engineering converge to support haptics. A wide variety of applications has emerged, spanning many areas of human need such as product design, medical trainers, and rehabilitation. Haptics is poised for rapid growth. Just as primitive humans forged hand tools to triumph over harsh nature, we need to develop smart devices to interface with information-rich real and virtual worlds.
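To give a concrete flavor of computer haptics, the simplest classic rendering technique is the penalty method: when the user's probe penetrates a virtual surface, the software commands a restoring force proportional to the penetration depth. The sketch below is illustrative only and not from the article; the wall position, stiffness value, and servo rate are hypothetical.

```python
# Minimal sketch of penalty-based haptic rendering for a 1-DOF "virtual wall".
# Illustrative example; stiffness and servo rate are hypothetical values.

WALL_POSITION = 0.0   # wall surface at x = 0 (meters); free space is x > 0
STIFFNESS = 1000.0    # spring constant k in N/m (hypothetical)

def wall_force(x: float) -> float:
    """Return the force commanded for a probe at position x.

    Penalty method: when the probe penetrates the wall (x < 0), push
    back with a spring force F = k * penetration; otherwise render
    free space (zero force).
    """
    penetration = WALL_POSITION - x
    if penetration > 0:           # probe is inside the wall
        return STIFFNESS * penetration
    return 0.0                    # free space: no force

# In a real device this computation runs in a fast (~1 kHz) servo loop:
# read probe position -> compute force -> command the actuators.
for x in (0.01, 0.0, -0.005):
    print(f"x = {x:+.3f} m  ->  F = {wall_force(x):.1f} N")
```

In practice the servo loop must run at roughly 1 kHz for the wall to feel stiff rather than spongy, which is why haptic rendering is far more update-rate-sensitive than graphics rendering.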
Given the ever-increasing quantities and types of information that surround us, and to which we need to respond rapidly, there is a critical need to explore new ways of interacting with information. To be efficient in this interaction, it is essential that we utilize all of our sensorimotor capabilities. Our haptic system – with its tactile, kinesthetic, and motor capabilities, together with the associated cognitive processes – presents a uniquely bidirectional information channel to our brains, yet it remains underutilized. If we add force and/or distributed tactile feedback of sufficient range, resolution, and frequency bandwidth to match the capabilities of our hands and other body parts, a large number of applications open up, such as haptic aids for a blind user surfing the net or a surgical trainee perfecting their trade. Ongoing engineering revolutions in information technology and the miniaturization of sensors and actuators are bringing this dream ever closer to reality.

Virtual environments (VEs), generally referred to as virtual reality in the popular press, have caught the imagination of the lay public as well as researchers working in a wide variety of disciplines. VEs are computer-generated synthetic environments with which a human user can interact to perform perceptual and motor tasks. A typical VE system consists of a helmet that can project computer-generated visual images and sounds appropriate to the gaze direction, and special gloves with which one can command a computer through hand gestures. The possibility that, by wearing such devices, one could be mentally transported to and immersed in virtual worlds built solely through software is both fascinating and powerful. Applications of this technology include a large variety of human activities such as training, education, entertainment, health care, scientific...
References

Note: Most of the articles listed below are our previous review articles, which, in turn, contain references to works by us and others for a deeper study of haptics. Many of these documents are available in downloadable PDF format at http://touchlab.mit.edu

1. Salisbury, J. K. and Srinivasan, M. A., Sections on Haptics, in Virtual Environment Technology for Training, BBN Report No. 7661, prepared by The Virtual Environment and Teleoperator Research Consortium (VETREC), MIT, 1992.
2. Srinivasan, M. A., Sections on Haptic Perception and Haptic Interfaces, in Bishop, G., et al., Research Directions in Virtual Environments: Report of an NSF Invitational Workshop, Computer Graphics, Vol. 26, No. 3, 1992.
3. Srinivasan, M. A., Haptic Interfaces, in Virtual Reality: Scientific and Technological Challenges, Eds: N. I. Durlach and A. S. Mavor, Report of the Committee on Virtual Reality Research and Development, National Research Council, National Academy Press, 1995.
4. Srinivasan, M. A. and Basdogan, C., Haptics in Virtual Environments: Taxonomy, Research Status, and Challenges, Computers and Graphics, Vol. 21, No. 4, 1997.
5. Salisbury, J. K. and Srinivasan, M. A., Phantom-Based Haptic Interaction with Virtual Objects, IEEE Computer Graphics and Applications, Vol. 17, No. 5, 1997.
6. Srinivasan, M. A., Basdogan, C., and Ho, C.-H., Haptic Interactions in the Real and Virtual Worlds, in Design, Specification and Verification of Interactive Systems '99, Eds: D. Duke and A. Puerta, Springer-Verlag Wien, 1999.
7. Biggs, S. J. and Srinivasan, M. A., Haptic Interfaces, in Virtual Environment Handbook, Ed: K. M. Stanney, Lawrence Erlbaum Associates, Ch. 5, pp. 93-116, 2002.
8. Basdogan, C. and Srinivasan, M. A., Haptic Rendering in Virtual Environments, in Virtual Environment Handbook, Ed: K. M. Stanney, Lawrence Erlbaum Associates, Ch. 6, pp. 117-134, 2002.
9. Basdogan, C., De, S., Kim, J., Muniyandi, M., Kim, H., and Srinivasan, M. A., Haptics in Minimally Invasive Surgical Simulation and Training, IEEE Computer Graphics and Applications, Vol. 24, No. 2, pp. 56-64, 2004.
10. Salisbury, K., Conti, F., and Barbagli, F., Haptic Rendering: Introductory Concepts, IEEE Computer Graphics and Applications, Vol. 24, No. 2, pp. 24-32, 2004.
11. Kim, J., Kim, H., Tay, B. K., Muniyandi, M., Jordan, J., Mortensen, J., Oliveira, M., and Slater, M., Transatlantic Touch: A Study of Haptic Collaboration over Long Distance, Presence: Teleoperators & Virtual Environments, Vol. 13, No. 3, pp. 328-337, 2004.
12. Wessberg, J., Stambaugh, C. R., Kralik, J. D., Beck, P. D., Laubach, M., Chapin, J. K., Kim, J., Biggs, S. J., Srinivasan, M. A., and Nicolelis, M. A. L., Real-time prediction of hand trajectory by ensembles of cortical neurons in primates, Nature, 408:361-365, 2000.