Algorithms have been developed to provide haptic rendering of three-dimensional (3D) objects in virtual (that is, computationally simulated) environments. The goal of haptic rendering is to generate tactual displays of the shapes, hardnesses, surface textures, and frictional properties of 3D objects in real time. Haptic rendering is a major element of the emerging field of computer haptics, which invites comparison with computer graphics. We have already seen various applications of computer haptics in the areas of medicine (surgical simulation, telemedicine, haptic user interfaces for blind people, and rehabilitation of patients with neurological disorders), entertainment (3D painting, character animation, morphing, and sculpting), mechanical design (path planning and assembly sequencing), and scientific visualization (geophysical data analysis and molecular manipulation).

Some elements of the collision-detection algorithms used in computer graphics can be reused in computer haptics. For example, haptic-rendering algorithms can take advantage of the space-partitioning, local-searching, and hierarchical-data-structure techniques of computer graphics to reduce the computation time needed to detect collisions. However, mere detection of collisions, as in computer graphics, is not enough: how a collision occurs and how it evolves over time must be taken into account to compute interaction forces accurately. Going beyond computer-graphics collision-detection algorithms, it is necessary to develop algorithms according to a client-server model that synchronizes the visual and haptic displays while keeping update rates acceptably high. For example, by use of multithreading techniques, one can calculate the contact forces at a rate of 1 kHz in one thread while updating visual images at 30 Hz in another thread.
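The two-thread arrangement described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the loop bodies are placeholders, the iteration counts stand in for one simulated second of operation (in a real system each loop would be paced by a hardware timer at 1 kHz and 30 Hz), and all names are hypothetical.

```python
import threading

class SharedState:
    """State shared between the haptic and graphics threads."""
    def __init__(self):
        self.lock = threading.Lock()
        self.force = (0.0, 0.0, 0.0)  # latest contact force
        self.force_updates = 0
        self.frames_drawn = 0

def haptic_loop(state, steps=1000):
    """One simulated second of force computation at 1 kHz."""
    for _ in range(steps):
        f = (0.0, -1.0, 0.0)  # placeholder: real code queries device pose
        with state.lock:
            state.force = f
            state.force_updates += 1

def graphics_loop(state, frames=30):
    """One simulated second of visual updates at 30 Hz."""
    for _ in range(frames):
        with state.lock:
            latest = state.force  # read most recent force for visual cues
        # render the scene using `latest` (omitted)
        state.frames_drawn += 1

state = SharedState()
t1 = threading.Thread(target=haptic_loop, args=(state,))
t2 = threading.Thread(target=graphics_loop, args=(state,))
t1.start(); t2.start()
t1.join(); t2.join()
```

The essential point the sketch captures is the rate asymmetry: the haptic thread runs roughly 33 times more often than the graphics thread, so the two loops communicate only through a small, lock-protected shared state rather than in lockstep.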
This work was done by Cagatay Basdogan of Caltech, Chih-Hao Ho of Cambridge Research Associates, and Mandayam Srinivasan of MIT for NASA's Jet Propulsion Laboratory. For further information, access the Technical Support Package (TSP) free on-line at www.techbriefs.com/tsp under the Information Sciences category.
This software is available for commercial licensing. Please contact Don Hart of the California Institute of Technology at (818) 393-3425. Refer to NPO-21191.
This Brief includes a Technical Support Package (TSP).

Algorithms for Haptic Rendering of 3D Objects (reference NPO-21191) is currently available for download from the TSP library.
Overview
The document is a technical support package from NASA's Jet Propulsion Laboratory (JPL) detailing advancements in haptic rendering algorithms for 3D objects in virtual environments. Authored by Cagatay Basdogan, Chih-Hao Ho, and Mandayam Srinivasan, the report highlights the development of software algorithms that enable users to touch, feel, and manipulate virtual objects through a haptic interface, which provides force feedback.
The primary motivation behind this work is the need for realistic interaction in virtual environments, which has numerous applications, including medical training, simulation, and design. The algorithms presented are designed to be significantly faster than existing methods, employing a novel approach to collision detection and response. This involves two main components: collision detection, where the position and orientation of the haptic device are tracked to identify interactions with virtual objects, and collision response, which computes the interaction forces based on predefined rules and conveys them to the user.
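The collision-response step described above can be illustrated with a common penalty-based scheme: when the tracked probe penetrates a virtual object, a restoring force proportional to the penetration depth is applied along the surface normal. This sketch is not the authors' algorithm; it uses a simple sphere primitive and a Hooke's-law rule, and the stiffness value and function names are assumptions for illustration.

```python
import math

STIFFNESS = 500.0  # N/m; hypothetical surface stiffness

def collision_response(probe_pos, center, radius, k=STIFFNESS):
    """Return the force (N) pushing the haptic probe out of a sphere.

    probe_pos, center: (x, y, z) tuples in metres; radius in metres.
    Returns the zero vector when the probe is outside the sphere.
    """
    dx = [p - c for p, c in zip(probe_pos, center)]
    dist = math.sqrt(sum(d * d for d in dx))
    penetration = radius - dist
    if penetration <= 0.0 or dist == 0.0:
        return (0.0, 0.0, 0.0)  # no contact (or degenerate centre hit)
    normal = [d / dist for d in dx]  # outward surface normal at the probe
    # Hooke's law: force magnitude k * penetration, directed outward
    return tuple(k * penetration * n for n in normal)
```

For example, a probe 1 cm inside a 10 cm sphere along the y axis receives a 5 N upward force (500 N/m x 0.01 m); in a full renderer this force would be recomputed every millisecond and sent to the haptic device.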
The document emphasizes the importance of haptic feedback in enhancing the user experience by providing a tactile representation of 3D objects and their surface details. This capability is crucial for applications such as minimally invasive surgical training, where understanding the texture and resistance of tissues can improve the effectiveness of training simulations.
The report also includes references to various studies and techniques related to haptic rendering, collision detection, and virtual environments, showcasing the breadth of research in this field. It notes that the work was conducted under NASA contract and does not imply any endorsement of specific products or manufacturers.
Overall, the document provides a comprehensive overview of the state of the art in haptic rendering, illustrating how these algorithms can improve training and simulation in domains that depend on realistic virtual interactions, and it remains a useful contribution to the development of virtual reality technologies.

