Design of the ImmersiveTouch™: a High-Performance Haptic Augmented Virtual Reality System

Cristian Luciano, Pat Banerjee, Lucian Florea, Greg Dawe
Electronic Visualization Laboratory
Industrial Virtual Reality Institute
University of Illinois at Chicago
842 West Taylor St., Chicago, IL 60607
{clucia1, banerjee, lflore11};

ImmersiveTouch™ is the next generation of augmented virtual reality technology: the first system to integrate a haptic device, a head- and hand-tracking system, and a high-resolution, high-pixel-density stereoscopic display. Its ergonomic design provides a comfortable working volume within the space of a standard desktop. The haptic device is collocated with the 3D graphics, giving the user a more realistic and natural means of manipulating and modifying 3D data in real time. This high-performance, multi-sensorial computer interface allows easy development of medical, dental, engineering, and scientific virtual reality simulation and training applications that engage multiple senses: auditory, visual, tactile, and kinesthetic.



ImmersiveTouch™ 1,2 is a new haptics-based, high-resolution augmented virtual reality system that provides an efficient way to display and manipulate three-dimensional data for training and simulation purposes. It is a complete hardware and software solution (Figure 1). The hardware integrates 3D stereo visualization, force feedback, head and hand tracking, and 3D audio. The software provides a unified API (Application Programming Interface) that handles volume processing, graphics rendering, haptics rendering, 3D audio feedback, and interactive menus and buttons.

This paper describes the design process of both the hardware and the software of the ImmersiveTouch™ prototype. The following section explains the problems of current virtual reality systems and how they motivated the design of this system. Section 3 describes the hardware constraints considered to achieve the optimal placement of its components. Section 4 clarifies how the ImmersiveTouch™ API, which integrates a set of C++ libraries, provides an easy workbench for developing haptics-based virtual reality applications. Section 5 describes the calibration procedure needed for correct graphics/haptics collocation. Finally, section 6 discusses the system performance and possible future improvements.

Figure 1: The ImmersiveTouch™ prototype

1 ™ Board of Trustees of the University of Illinois
2 Patent pending


Background and previous research

Rear-projection-based virtual reality (VR) devices, including the CAVE® [4] and the ImmersaDesk® [5], create a virtual environment by projecting stereoscopic images on screens located between the users and the projectors. These displays suffer from occlusion of the image by the user's hand or any interaction device located between the user's eyes and the screens. When a virtual object is located close to the user, the user can place his/her hand "behind" the virtual object; however, the hand will always appear "in front of" the virtual object because the image of the virtual object is projected on the screen. This visual paradox confuses the brain and breaks the stereoscopic illusion.

Augmented reality displays are more suitable for haptics-based applications because, instead of projecting the images onto physical screens, they use half-silvered mirrors to create virtual projection planes that are collocated with the haptic device workspaces. The user's hands, located behind the mirror, are integrated with the virtual space and provide a natural means of interaction. The user can still see his/her hands without occluding the virtual objects.

Another problem of regular VR devices displaying stereo images is known as the "accommodation/convergence conflict" [1] (Figure 2). Accommodation is the muscle tension needed to change the focal length of the eye lens in order to focus at a...