Principle of Haptic Perception

Psychologists have been studying haptics, the sensing and manipulation through touch, since the early 1900s. In the late 1980s, with the creation of machines specifically designed for touch, it became evident that a new field was emerging. Rather than introducing a separate term, we chose to broaden the definition of haptics to include machine touch and interactions between humans and machines.

The definition of haptics we are using includes all aspects of touch-based information acquisition and object manipulation by humans, machines, or a combination of both. These interactions can occur in real, virtual, or teleoperated environments. This broad understanding of haptics has led to extensive global research and development efforts. To better organize the rapidly growing interdisciplinary research literature in this field, it is useful to define sub-areas of haptics. Haptics can be categorized as follows:
1. Human haptics focuses on studying how humans sense and manipulate objects through touch.
2. Machine haptics involves designing, constructing, and using machines to replace or enhance human touch.
3. Computer haptics deals with algorithms and software that create the sensation of touch in virtual objects (similar to computer graphics).
As a result, various disciplines such as biomechanics, neuroscience, psychophysics, robot design and control, mathematical modeling and simulation, and software engineering contribute to the field of haptics. This has resulted in a wide range of applications addressing different human needs, such as product design, medical training, and rehabilitation.

Haptics is on the verge of experiencing accelerated growth. Similar to how early humans created hand tools to conquer a challenging environment, we must invent intelligent devices to connect with both the physical and digital worlds that are filled with abundant information. With the continuous expansion of information quantity and diversity, which demands prompt reactions, it becomes crucial to investigate novel methods of interacting with this information. To optimize this interaction process, it is vital for us to make full use of our sensory and physical abilities.

The haptic system encompasses tactile, kinesthetic, and motor capabilities as well as cognitive processes. It offers a distinct information pathway to our brains that isn’t fully tapped into. By integrating force and distributed tactile feedback that aligns with the capacities of our hands and body parts, numerous applications can be realized. For instance, haptic aids have the potential to aid visually impaired individuals in internet browsing or enhance the abilities of surgical trainees.

Virtual reality, also known as virtual environments (VEs), is being brought closer to us through advancements in information technology and the miniaturization of sensors and actuators. These synthetic environments, generated by computers, enable human users to interact and engage in perceptual and motor activities. They have captured the interest of both the general public and researchers across various fields.

A Virtual Reality (VR) system typically consists of a helmet that displays computer-generated visuals and sounds based on where the user is looking, along with gloves that allow users to control a computer using hand gestures. The ability to mentally immerse individuals in virtual worlds created through software has immense potential and impact. This technology is utilized in various domains such as training, education, entertainment, healthcare, scientific visualization, telecommunication, design, manufacturing, and marketing.

Virtual environment systems that solely stimulate the visual and auditory senses have limitations in their capacity to engage the user. Similar to real life experiences, it is desirable to incorporate the haptic sensorimotor system, which not only delivers a sense of touch and feel but also enables object manipulation. The human hand can execute diverse actions like pressing, grasping, squeezing, and stroking objects. It can investigate qualities such as surface texture, shape, and softness while also having the ability to manipulate tools like pens or jack-hammers.

The combination of physical engagement, visual and auditory perception in an environment creates a captivating experience. Lack of touch in both real and virtual environments limits human interaction. To enhance immersion in a virtual setting, it is more effective to integrate a basic haptic interface with visual and auditory displays rather than solely focusing on improving the visuals.

Haptic interfaces are devices that allow people to interact manually with virtual environments or remotely operated systems. These devices are used for activities that typically involve using hands in the physical world, like exploring and manipulating objects. In other words, users give motor commands and receive corresponding tactile feedback. Sometimes, haptic interactions can incorporate other sensory stimuli like sight and sound.

While computer keyboards, mice, trackballs, and instrumented gloves available in the market are considered simple haptic interfaces, they can only transmit user commands to the computer and cannot provide a realistic sense of touch to the user. However, recent advancements in force-reflecting haptic interface hardware and haptic rendering software have generated significant enthusiasm. This technology is becoming more refined and has opened up new and intriguing areas of research.

To fully support the numerous existing applications of haptics and explore potential new applications, it is crucial to comprehend the mechanics of touch interaction, including our perception and manipulation capabilities, as well as their impact on task performance. Thus, the field of haptics research faces a two-fold challenge: attaining an in-depth scientific understanding of our haptic sensorimotor system and creating suitable haptic interface technology.

This short introductory document primarily gives an overview of the main subareas of haptics. We also direct the reader to our more detailed reviews, which include references to works by us and others, for a more comprehensive understanding. The first section covers the fundamentals of human perception and how to replicate the sensation. Following that, we provide a basic introduction to human haptics, which is the study of the human sensorimotor system related to manual exploration and manipulation. The subsequent section focuses on machine haptics, which deals with electromechanical devices used as haptic interfaces.

Next, the concept of Computer Haptics is explained and sources for review papers on haptic interactions, including paradigms, algorithms, and software, are mentioned. The text also discusses various interesting applications of haptics, such as medical simulators and virtual environments shared by multiple users. Lastly, the future challenges and opportunities in the field of haptics are briefly discussed.

2. Touching Real and Virtual Objects

When a person touches an object with bare skin or through a tool, forces are exerted on the skin.

The nervous system conveys sensory information from sensors in the skin, joints, tendons, and muscles to the brain, resulting in haptic perception. The brain then activates muscles, leading to hand and arm motion that modifies the touch sensory information. This sensorimotor loop occurs during both exploration and manipulation of objects. To simulate the sensation of touching virtual objects, we must generate the reaction force of objects on the skin.

The use of a force-reflecting haptic interface device allows users to feel virtual objects much as they would touch a real object through a tool. When the user manipulates the end-effector of the haptic interface device, position sensors convey its tip position to the computer. In real time, the computer models calculate torque commands for the actuators on the haptic interface, resulting in the application of appropriate reaction forces on the user. This leads to the perception of virtual objects through haptic feedback.
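To make this cycle concrete, the following minimal C++ sketch computes the reaction force for the simplest possible virtual object, a flat frictionless wall rendered with a pure spring (penalty) law. It is an illustrative sketch only: the function name, the wall placement, and the stiffness value are assumptions made for this example and do not correspond to any particular device's API.

```cpp
#include <array>
#include <cstdio>

using Vec3 = std::array<double, 3>;

// One pass of the ~1 kHz haptic loop for a flat virtual wall at y = 0.
// Input: probe tip position reported by the device's position sensors.
// Output: reaction force to be sent to the device's actuators.
Vec3 computeWallForce(const Vec3& tip, double wallStiffness /* N/m */)
{
    Vec3 force{0.0, 0.0, 0.0};
    if (tip[1] < 0.0) {                   // collision detected: tip is below the wall plane
        double depth = -tip[1];           // penetration depth in metres
        force[1] = wallStiffness * depth; // reaction force along the wall normal (+y)
    }
    return force;
}

int main()
{
    // Example: tip 2 mm inside a 1000 N/m wall -> 2 N pushing the hand back out.
    Vec3 f = computeWallForce({0.01, -0.002, 0.0}, 1000.0);
    std::printf("reaction force = (%g, %g, %g) N\n", f[0], f[1], f[2]);
    return 0;
}
```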

3. Human Haptics

To develop haptic interfaces that optimize interactions with human users, it is important to comprehend the functions of the mechanical, sensory, motor, and cognitive subsystems of the human haptic system. The mechanical structure of the human hand comprises a complex arrangement of 19 bones linked by nearly as many frictionless joints, enveloped by soft tissues and skin. Tendons connect the bones to around 40 intrinsic and extrinsic muscles, enabling activation of the hand's 22 degrees of freedom.

The sensory system consists of receptors and nerve endings located in the skin, joints, tendons, and muscles. When these receptors are stimulated by mechanical, thermal, or chemical stimuli, they send electrical impulses through the afferent neural network to the central nervous system (including the brain). Subsequently, the central nervous system utilizes efferent neurons to transmit instructions to the muscles for achieving the intended motor action.

When engaging in any activity that involves touching an object, whether for exploration or manipulation, the properties of the skin and the tissues beneath it play a crucial role. A key example is the fingerpad, which primates use in a wide range of precise tasks. The fingerpad consists of ridged skin approximately 1 mm thick, enclosing soft tissues composed primarily of semi-liquid fat. As a material, the fingerpad exhibits complex mechanical behavior: it is inhomogeneous, anisotropic, and rate- and time-dependent.

The skin’s compliance and frictional properties, combined with the hand’s sensory and motor abilities, enable smooth gliding over surfaces while maintaining contact. Moreover, these characteristics facilitate stable manipulation of smooth objects. The mechanical loading on the skin and transmission of mechanical signals through it heavily rely on the mechanical properties of both the skin and subcutaneous tissues. Additionally, cutaneous mechanoreceptors play a vital role in converting these signals.

The information received by the brain from the hand’s contact with an object can be divided into two groups: tactile information and kinesthetic information. Tactile information is about the type of contact with the object and comes from low threshold mechanoreceptors in the skin, particularly in and around the fingerpad. Kinesthetic information refers to limb position, movement, and associated forces, which are detected by sensory receptors in the skin near joints, joint capsules, tendons, muscles, as well as neural signals from motor commands.

When an object contacts a passive, stationary hand, the tactile information it conveys varies with the contact, while the kinesthetic information remains essentially constant, signalling only limb posture. Conversely, during active hand motion without any contact with objects or other parts of the skin, only kinesthetic information is relayed; the very absence of tactile information can itself indicate unimpeded motion. Both types of information, however, contribute to all sensory and manipulatory tasks performed actively with a normal hand.

The presence of free nerve endings and specialized receptors in the skin allows for the detection of various sensations such as temperature, pain (mechanical, thermal, chemogenic), and itch. Controlling contact conditions is crucial for successfully completing tasks. Humans can use fast muscle or spinal reflexes as well as slower conscious deliberate actions to maintain control. Studies involving lifting objects held in a pinch grasp demonstrate that motor actions like increasing grip force can be initiated within 70 msec after the object starts slipping from the fingerpad. The sensory signals from cutaneous afferents are vital for task performance. The mechanical properties of the skin and subcutaneous tissues, along with continuous monitoring by various sensors and integration with motor system actions, contribute to human abilities in grasping and manipulation.

References to relevant literature on human haptic abilities in real-world settings can be found in [3] and [7].

4. Machine Haptics

Machine haptics, the design, construction, and use of machines to replace or augment human touch, encompasses robots operated autonomously or remotely. Here, however, our focus is specifically on haptic interfaces for virtual environments. Haptic interfaces are composed of mechanical components that physically interact with the human body in order to exchange signals with the nervous system.

The haptic interface allows the user to physically manipulate it to perform tasks. It also provides sensory information by stimulating the user’s tactile and kinesthetic senses. Overall, haptic interfaces have two main functions: measuring the user’s hand positions and contact forces, and displaying contact forces and positions to the user.

Among these position (or kinematic) and contact force variables, the selection of motor action variables (i.e., inputs to the computer) and sensory display variables (i.e., inputs to the human) depends on the hardware, the software design, and the interface tasks. Currently, most force-reflecting haptic interfaces sense the position of their end-effector and present forces to the user. Ultimately, human capabilities and limitations determine the performance criteria for haptic devices.

Simulation of haptic interactions with virtual environments (VEs) that aim to replicate real-world settings will never be exact. The limitations of human capabilities will dictate which approximations are acceptable. The desired qualities of force-feedback haptic interfaces are as follows:
1. Minimal inertia and friction that restrict motion, allowing for a sense of unrestricted movement.
2. The range, precision, and speed of position detection and force reflection should align with human capacities for the specific tasks the haptic interface is used in.

The user should not be able to pass through solid objects by exceeding the force range of the interface. Additionally, the user should not experience unintended vibrations caused by the quantization of position or a low servo rate. Furthermore, the user should not perceive stiff objects as soft due to low structural and servo stiffness. Meeting these conditions is challenging due to the fine sensitivity and approximately 1 kHz bandwidth of the human tactile system. However, the fact that the human control bandwidth in haptic interactions is only around 10 Hz can be beneficial.
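To make the trade-off between servo rate and achievable stiffness concrete, one guideline widely quoted in the haptic-control literature (it is not part of this overview) is the passivity condition for a sampled virtual wall: roughly K ≤ 2b/T, where K is the virtual stiffness, b is the inherent viscous damping of the device, and T is the servo period. Under this rule of thumb, a device servoed at 1 kHz (T = 1 ms) can stably render walls about twice as stiff as the same device servoed at 500 Hz, which is one reason the haptic loop is typically run as fast as the hardware allows.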

Ergonomics and comfort are also crucial in haptic interfaces, since discomfort or pain supersedes all other sensations. Comprehensive surveys of the haptic interface devices developed to date are given in [1], [3], and [7]. In our MIT Touch Lab, we have developed device hardware and interaction software, and carried out psychophysical experiments, concerning haptic interactions with virtual environments. In particular, we have built two specialized devices, the linear and planar graspers, for conducting psychophysical experiments.

The linear grasper can imitate important mechanical properties like compliance, viscosity, and mass in haptic interactions. The planar grasper can simulate virtual walls and corners, as well as two springs within its workspace. The PHANTOM® device, created at MIT, has been utilized for prototyping diverse force-based haptic display features. Numerous rendering algorithms have been implemented on the PHANTOM to display the shape, compliance, texture, and friction of solid surfaces.
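As an illustration of the kind of computation behind such simulations, the sketch below renders a one-dimensional virtual object with a specified stiffness (the inverse of compliance), viscosity, and mass from sampled positions, estimating velocity and acceleration by finite differences. This is a generic impedance model written for this article, not the Touch Lab's implementation, and the numerical parameters are arbitrary examples.

```cpp
#include <cstdio>

// Generic 1-D impedance model: a virtual object with stiffness k, damping b,
// and mass m, rendered from positions sampled once per servo period dt.
struct Impedance1D {
    double k, b, m;       // stiffness (N/m), viscosity (N*s/m), mass (kg)
    double prevX, prevV;  // previous sampled position (m) and velocity (m/s)

    double force(double x, double dt)
    {
        double v = (x - prevX) / dt;  // finite-difference velocity estimate
        double a = (v - prevV) / dt;  // finite-difference acceleration estimate
        prevX = x;
        prevV = v;
        return k * x + b * v + m * a; // resistive force displayed to the user
    }
};

int main()
{
    Impedance1D obj{200.0, 1.5, 0.1, 0.0, 0.0}; // example parameters only
    double dt = 0.001;                          // ~1 kHz servo period
    // Squeeze the virtual object by 5 mm over 50 ms and print the commanded force.
    for (int i = 1; i <= 50; ++i) {
        double x = 0.005 * i / 50.0;
        std::printf("t = %2d ms  F = %.3f N\n", i, obj.force(x, dt));
    }
    return 0;
}
```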

All three devices, the linear and planar graspers and the PHANTOM, have been used in psychophysical experiments aimed at understanding the sensory and motor abilities of human users and at evaluating the efficacy of rendering algorithms in conveying object properties.

Computer Haptics is an evolving field of research focused on generating and presenting the touch sensations of virtual objects to human operators using force-feedback devices.

Similar to computer graphics, haptic rendering focuses on modeling and behavior of virtual objects, as well as real-time display algorithms. It encompasses the necessary software architecture for haptic interactions and their synchronization with visual and other display methods. The process of haptically rendering objects in virtual environments with a force feedback device involves two main components: collision detection and collision response. When the user manipulates the haptic device, sensors in the device detect the new position and orientation of the probe. If there is no collision between the simulated probe and virtual objects, the haptic interface device remains passive and does not exert any forces on the user. However, if a collision is detected, the mechanistic model calculates the reaction force based on the depth of penetration of the probe into the virtual object.

The force vectors calculated can be adjusted by mapping them onto the surface of the object to consider its details. These adjusted force vectors are then sent back to the user using the haptic device. This process of detecting collisions and providing a response is called a haptic loop, which must run continuously at approximately 1,000 times per second. Otherwise, virtual surfaces will feel less firm, or in the worst case, instead of feeling a surface, the user will perceive vibrations from the haptic device.
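One common way of making this adjustment, often called force shading in the haptic-rendering literature, is to keep the force magnitude given by the penetration depth while redirecting the force along a locally perturbed surface normal, so that a geometrically flat surface feels as if it had fine detail. The sketch below illustrates the idea; it is a simplified example written for this article, not the specific algorithm used in any system described here, and the names and numbers are invented.

```cpp
#include <array>
#include <cmath>
#include <cstdio>

using Vec3 = std::array<double, 3>;

static Vec3 normalize(const Vec3& v)
{
    double n = std::sqrt(v[0]*v[0] + v[1]*v[1] + v[2]*v[2]);
    return {v[0]/n, v[1]/n, v[2]/n};
}

// Force-shading style adjustment: magnitude comes from the penetration depth,
// direction comes from the geometric normal perturbed by a local "detail" term
// (e.g. sampled from a bump or texture map).
Vec3 shadedForce(double penetrationDepth, double stiffness,
                 const Vec3& geometricNormal, const Vec3& detailPerturbation)
{
    Vec3 n = normalize({geometricNormal[0] + detailPerturbation[0],
                        geometricNormal[1] + detailPerturbation[1],
                        geometricNormal[2] + detailPerturbation[2]});
    double magnitude = stiffness * penetrationDepth;
    return {magnitude * n[0], magnitude * n[1], magnitude * n[2]};
}

int main()
{
    // Example: 1 mm penetration into a 1000 N/m surface whose true normal is +y,
    // with a small sideways perturbation standing in for surface detail.
    Vec3 f = shadedForce(0.001, 1000.0, {0.0, 1.0, 0.0}, {0.2, 0.0, 0.0});
    std::printf("shaded force = (%.3f, %.3f, %.3f) N\n", f[0], f[1], f[2]);
    return 0;
}
```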

In recent times, various techniques have been developed for rendering virtual objects with haptic feedback. Similar to computer graphics, 3D objects can be represented as either surfaces or volumes in computer haptics. Surface models utilize parametric or polygonal representations, whereas volumetric models are constructed using voxels.

The different techniques for haptic rendering with force display can be categorized based on how the probing object is represented: (1) point-based, where the probe is represented as a point (similar to exploring and manipulating real objects with the tip of a stick), (2) ray-based, where the probe is represented as a line segment (similar to exploring and manipulating real objects with the entire length of a stick in addition to its tip), or (3) a 3D-object, where the probe is composed of a collection of points, line segments, and polygons.
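To illustrate how the choice of probe representation changes the basic collision query, the sketch below tests a stick against a virtual sphere in two ways: a point-based query considers only the tip, whereas a ray-based query considers the whole handle-to-tip segment and therefore also detects contact along the side of the stick. The geometry and numbers are invented for this example; practical renderers use more elaborate data structures for polygonal and volumetric models.

```cpp
#include <algorithm>
#include <array>
#include <cmath>
#include <cstdio>

using Vec3 = std::array<double, 3>;

static double dist(const Vec3& a, const Vec3& b)
{
    double dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
    return std::sqrt(dx*dx + dy*dy + dz*dz);
}

// Point-based probe: only the tip of the virtual stick is tested against the sphere.
double pointDepth(const Vec3& tip, const Vec3& center, double radius)
{
    return std::max(0.0, radius - dist(tip, center));
}

// Ray-based probe: the whole handle-to-tip segment is tested, so a contact along
// the side of the stick is detected even when the tip itself is clear of the object.
double segmentDepth(const Vec3& handle, const Vec3& tip,
                    const Vec3& center, double radius)
{
    Vec3 d{tip[0] - handle[0], tip[1] - handle[1], tip[2] - handle[2]};
    Vec3 w{center[0] - handle[0], center[1] - handle[1], center[2] - handle[2]};
    double t = (w[0]*d[0] + w[1]*d[1] + w[2]*d[2]) /
               (d[0]*d[0] + d[1]*d[1] + d[2]*d[2]);
    t = std::min(1.0, std::max(0.0, t));  // clamp to the segment
    Vec3 closest{handle[0] + t*d[0], handle[1] + t*d[1], handle[2] + t*d[2]};
    return std::max(0.0, radius - dist(closest, center));
}

int main()
{
    Vec3 handle{-0.10, 0.0, 0.0}, tip{0.10, 0.0, 0.0}; // a 20 cm virtual stick
    Vec3 center{0.0, 0.018, 0.0};                      // 2 cm-radius sphere overlapping the stick's middle
    double radius = 0.02;
    std::printf("point-based depth: %.4f m\n", pointDepth(tip, center, radius));
    std::printf("ray-based depth:   %.4f m\n", segmentDepth(handle, tip, center, radius));
    return 0;
}
```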

The type of interaction method used in simulations depends on the needs and complexity of the application. Today, there are many algorithms available to render the shape, surface texture, softness, and dynamics of virtual objects. To learn more about the issues and algorithms related to haptic rendering, please refer to [4], [5], [8], [9], and [10].

5. Applications
The integration of haptics into various virtual reality and teleoperation applications brings about exciting possibilities. In our Touch Lab, we have pursued three example applications which are summarized below.

Medical Simulators: Similar to how flight simulators train pilots, our multimodal virtual environment system is being used to develop virtual reality-based needle procedures and surgical simulators that allow medical trainees to visualize, touch, and manipulate realistic models of biological tissues and organs. This work involves the creation of instrumented hardware and software algorithms for real-time displays. Residents and experts from two hospitals have already tested an epidural injection simulator.

A minimally invasive surgery simulator is being developed and includes (a) in vivo measurement of the mechanical properties of tissues and organs, (b) the development of real-time algorithms for computing tool-tissue force interactions and organ deformations, and (c) verifying the effectiveness of the simulator for training purposes. This work is reviewed in [9].

Collaborative Haptics: In another project, collaborative haptics is being explored to enhance human-computer interaction and computer-mediated human-human interaction.

A study was conducted to investigate the impact of haptic feedback on collaborative tasks in a multimodal shared virtual environment system. Human subjects took part in experiments designed to determine whether haptic communication through force feedback enhances the sense of presence and of collaboration with a remote partner. Two scenarios were examined: one in which the partners were physically close, and another in which they were separated by a large distance (a transatlantic touch experiment with collaborators at University College London [11]).

Brain-Machine Interfaces: In a collaborative project with Prof. Nicolelis of Duke University Medical School, neuronal signals recorded from approximately 100 neurons in a monkey's motor cortex were used to control a robot in real time [12]. This was demonstrated not only within Duke but also across the internet, using a robot in our laboratory. The significance of this accomplishment lies in its potential to revolutionize the study of sensorimotor function in the central nervous system. It also paves the way for brain-machine interfaces that could be implanted in paralyzed patients, enabling them to control external devices such as smart prostheses, much as pacemakers and cochlear implants serve patients today. Beyond these examples, there are numerous other potential applications, including:

Medicine: manipulation of micro and macro robots for minimally invasive surgery; remote diagnosis for telemedicine; aids for the disabled, such as haptic interfaces for the blind.

Entertainment: video games and simulators that allow users to experience and control virtual objects, liquids, tools, and avatars.

Education: providing students with hands-on experiences of phenomena at different scales, from nanotechnology to astronomy; exploring hypothetical scenarios in physics outside of Earth; interacting with complex data sets.

Industry: integrating haptics into computer-aided design (CAD) systems, enabling designers to manipulate mechanical components within an immersive environment.

Graphic Arts: virtual art exhibitions, concert venues, and museums where users can remotely access and interact with musical instruments and experience the haptic aspects of displays; collaborative virtual sculpting over the internet.

6. Future Challenges and Opportunities

Many of the challenges and opportunities in the development of haptic interfaces and computer haptics have been discussed in our previous survey articles.

Despite the availability of both ground-based and exoskeletal force-reflecting haptic interface devices in the market, there is still a need for enhancements in the range, resolution, and frequency bandwidth of these devices to meet the performance level of human users. One of the desired improvements is the capability to reflect torques in addition to forces, as well as having enough degrees of freedom to allow for grasping and two-handed manipulation of objects. Among the technologies that need to be developed, tactile displays pose the greatest challenge in creating realistic haptic displays that replicate direct natural touch.

The emerging field of micro-mechanical systems offers potential for delivering precise arrays of tactile stimulators. While these actuators can produce relatively small forces and deflections, combining them with addressing electronics is predicted to result in affordable, lightweight, and compact arrays that can be worn without hindering user mobility. In computer haptics, the existing models of real-time haptic display for virtual objects are much simpler compared to the complex static and dynamic behavior of real-world objects.

Despite continuing increases in processing speed, the challenge of creating computationally efficient models and interaction techniques that deliver real-time haptic display at the accuracy and resolution of human perception will persist, because the computational complexity of such models, for example detecting collisions among multiple moving objects or performing real-time mechanistic analysis of deformable objects, can be exceedingly high.

Synchronization of the visual, auditory, and haptic displays is difficult because each modality requires a different type of approximation to simulate the same physical phenomenon. It is essential to use multiple processors with shared memory and/or multi-threading (a minimal sketch is given below). To make haptics across the internet useful to a large number of users, standardized protocols for distributed VEs should explicitly include haptics.
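The following is a minimal sketch of such a multi-rate, shared-memory arrangement, assuming a haptic thread at roughly 1 kHz and a graphics thread at roughly 30 Hz that share a small scene state protected by a mutex. The state variables, update rates, and wall model are illustrative assumptions only, not a prescription for how such a system must be structured.

```cpp
#include <atomic>
#include <chrono>
#include <cstdio>
#include <mutex>
#include <thread>

// Small scene state shared between the haptic and graphics threads.
struct SharedState {
    std::mutex m;
    double probeY = 0.01; // probe position (m), written by the haptic thread
    double forceY = 0.0;  // last rendered force (N), read by the graphics thread
};

int main()
{
    SharedState s;
    std::atomic<bool> running{true};

    std::thread haptic([&] {                    // ~1 kHz servo loop
        while (running) {
            {
                std::lock_guard<std::mutex> lk(s.m);
                s.probeY -= 0.0001;             // stand-in for sensed hand motion
                s.forceY = (s.probeY < 0.0) ? -1000.0 * s.probeY : 0.0; // virtual wall at y = 0
            }
            std::this_thread::sleep_for(std::chrono::milliseconds(1));
        }
    });

    std::thread graphics([&] {                  // ~30 Hz visual update
        for (int frame = 0; frame < 10; ++frame) {
            double y, f;
            {
                std::lock_guard<std::mutex> lk(s.m);
                y = s.probeY;
                f = s.forceY;
            }
            std::printf("frame %d: probe y = %.4f m, force = %.2f N\n", frame, y, f);
            std::this_thread::sleep_for(std::chrono::milliseconds(33));
        }
        running = false;                        // stop the haptic loop after the demo frames
    });

    haptic.join();
    graphics.join();
    return 0;
}
```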

Haptic interfaces can only approximate our interactions with the real environment because of inherent hardware limitations. This does not mean, however, that the haptic experiences they generate will necessarily feel artificial to the user. To see why, consider the synthesized visual experiences we have when watching television or playing video games. Real visual stimuli are continuous across space and time, whereas these visual interfaces display images at a rate of approximately 30 frames per second. Nevertheless, we perceive a sense of realism, and even a feeling of being present in another place, because the displays are matched to the limitations of the human visual system.

The hope is that the required approximations in generating synthesized haptic experiences will be sufficient for a specific task, as the human haptic system also has limitations that can be exploited in a similar manner. To determine these approximations and understand what is feasible in creating synthetic haptic experiences, quantitative human studies tightly integrated with technology development are crucial. These studies aim to assess which types of stimulation offer the most valuable and significant haptic cues for the task at hand.

Acknowledgments

This document has been compiled using significant excerpts from my previous review papers, which are listed below and appropriately cited. Numerous colleagues and students, who are too many to individually mention, have contributed to the work conducted in our MIT Touch Lab and summarized in the aforementioned review papers. I would like to express my appreciation to my coauthors of the reviews, particularly Ken Salisbury, Cagatay Basdogan, and Chih-Hao Ho, for their collaboration on various haptics related projects. SensAble Technologies, Inc. holds the registered trademark for PHANTOM. The author retains the copyrights for this article.

References:

Please note that the articles mentioned below are our previous reviews, which contain references to both our work and the work of others for a more comprehensive exploration of haptics. You can find many of these documents in downloadable pdf format on our website: http://touchlab.mit.edu
1. Salisbury, J K and Srinivasan, M A, Sections on Haptics, In Virtual Environment Technology for Training, BBN Report No. 7661, Prepared by The Virtual Environment and Teleoperator Research Consortium (VETREC), MIT, 1992.
2. Srinivasan, M A, Sections on Haptic Perception and Haptic Interfaces, In Research Directions in Virtual Environments: Report of an NSF Invitational Workshop, Ed: G. Bishop et al., 1992.

3. Srinivasan, M A, Haptic Interfaces, In Virtual Reality: Scientific and Technological Challenges, Eds: N. I. Durlach and A. S. Mavor, National Academy Press, 1995.

4. Srinivasan, M A and Basdogan, C, Haptics in Virtual Environments: Taxonomy, Research Status, and Challenges, Computers and Graphics, 1997.

5. Salisbury, J K and Srinivasan, M A, Phantom-Based Haptic Interaction with Virtual Objects, IEEE Computer Graphics and Applications, 1997.

6. Srinivasan, M A, Basdogan, C, and Ho, C-H, Haptic Interactions in the Real and Virtual Worlds, Design, Specification and Verification of Interactive Systems ‘99, Eds: D. Duke and A. Puerta, Springer-Verlag Wien, 1999.

7. Biggs, S J and Srinivasan, M A, Haptic Interfaces, Virtual Environment Handbook, Ed: KM Stanney, Lawrence Erlbaum Associates, Ch. 5, pp. 93-116, 2002.

8. Basdogan, C and Srinivasan, M A, Haptic Rendering in Virtual Environments, Virtual Environment Handbook, Ed: KM Stanney, Lawrence Erlbaum Associates, Ch. 6, pp. 117-134, 2002.

9. Basdogan C, De, S, Kim, J, Muniyandi, M, Kim, H, and Srinivasan, M A. Haptics in Minimally Invasive Surgical Simulation and Training, IEEE Computer Graphics and Applications, Vol. 24, No. 2, pp. 56-64, 2004.

10. Salisbury, K, Conti, F, and Barbagli, F, Haptic Rendering: Introductory Concepts, IEEE Computer Graphics and Applications, Vol. 24, No. 2, pp. 24-32, 2004.

11. Kim, J, Kim, H, Tay, B K, Muniyandi, M, Jordan, J, Mortensen, J, Oliveira, M, and Slater, M, Transatlantic Touch: A Study of Haptic Collaboration over Long Distance, Presence: Teleoperators & Virtual Environments, Vol. 13, No. 3, pp. 328-337, 2004.

12. Wessberg, J, Stambaugh, C R, Kralik, J D, Beck, P D, Laubach, M, Chapin, J K, Kim, J, Biggs, S J, Srinivasan, M A, and Nicolelis, M A L, Real-time prediction of hand trajectory by ensembles of cortical neurons in primates, Nature, Vol. 408, pp. 361-365, 2000.
