Why Haptics, Why Now?

It was nearly 600 million years ago that multicellular organisms first developed the sense of touch, making it the earliest sense to evolve through natural selection. This primacy highlights the foundational role of touch in survival, shaping the way organisms interact with their environment.

Fast forward half a billion years, yet only 70 thousand years since the cognitive revolution, and another multicellular organism, the human being, communicates, drives, flies, dives in the deep oceans, explores the universe and travels to other planets, doing all of this in two worlds: the physical and the digital. Yet most of the technology we develop relies heavily on vision. In driving, for example, it is commonly estimated that about 90% of decisions are based on visual input; touch, along with the vestibular sense, kinaesthesia and hearing, plays only a supporting role in decision-making.

Despite the dominance of vision in technology, it is well documented that blind individuals can lead functional, fulfilling lives by relying on their other senses, such as touch and hearing, to navigate their environment and perform daily tasks. Similarly, deaf individuals compensate with enhanced visual and tactile senses, enabling them to communicate through sign language and maintain full independence. Now consider a person who has lost kinaesthesia and touch, a condition known as deafferentation: coordinating movement on visual information alone is extraordinarily difficult. Without these senses, basic actions like walking, grasping objects or maintaining balance become nearly impossible, as demonstrated by patients with somatosensory loss who cannot move effectively unless they constantly monitor their limbs with their eyes. A well-known example is Ian Waterman, 'the man who lost his body'.

It's remarkable, to say the least, that we've developed technologies and devices that largely overlook the vast sensory input provided by the millions of biological sensors in our bodies. These receptors offer crucial information about our environment, yet many modern devices fail to fully or even adequately engage this vital sense.

Our skin contains four primary types of tactile mechanoreceptors, or, simply put, touch sensors, each specialised for a different aspect of touch:

1. Merkel Cells – The most sensitive to sustained pressure and texture. Found in high density in the fingertips, they help us detect fine details like the edges of objects and are essential for tasks requiring precision, such as reading Braille or handling delicate objects.
2. Meissner’s Corpuscles – Responsive to light touch and changes in texture. They are especially concentrated in areas like the fingertips and lips, where rapid sensing is necessary, such as when gripping or holding objects. Their sensitivity to rapid changes in touch allows us to detect slipping objects and adjust our grip.
3. Ruffini Endings – Detectors of skin stretch that contribute to our perception of hand position and finger movement. They provide feedback when the skin is deformed and play a role in the control of movement and posture.
4. Pacinian Corpuscles – Responsible for sensing deep pressure and vibration. They are sensitive to high-frequency vibrations and play a key role in detecting tools or objects moving in contact with the skin, such as when using a pen or operating machinery.

In addition to these skin receptors, the muscle spindles and Golgi tendon organs provide critical feedback from our muscles and tendons. Muscle spindles sense changes in muscle length and the rate of stretching, while Golgi tendon organs monitor the force exerted by the muscles. These sensors are crucial for coordinating movement, balance and proprioception, enabling us to make fine motor adjustments as we interact with the world. Close your eyes and move your hand to touch the tip of your nose. How can you do that without looking? That is the muscle spindles and Golgi tendon organs at work.
Another remarkable property of touch is its role in audio-tactile integration, where the auditory cortex is activated by touch stimuli alone. This means that, under certain conditions, we can "hear" with our hands. Research has shown that tactile stimulation can excite areas of the brain typically associated with auditory processing, demonstrating the brain’s capacity for cross-modal sensory integration.

Despite its immense potential, touch is frequently overlooked by modern technology. Beyond driving, where tactile feedback contributes to decision-making, touch plays a crucial role in technologies like XR headsets, laptops, smartphones and gaming controllers. These devices increasingly depend on haptic feedback to create immersive experiences, yet this aspect is often underdeveloped or limited to basic vibrations. Advanced applications such as virtual reality, where touch could profoundly enhance realism and immersion, are still in their infancy in terms of incorporating sophisticated haptic feedback. Sometimes touch and kinaesthesia are so overlooked that they are effectively "removed", even where they are critical. Take, for example, the steering of a boat's rudder. With a mechanical steering system, the captain could feel the hydrodynamic forces acting on the rudder, which was essential for controlling the boat. When an electronic servo system replaced the mechanical linkage, the captain could no longer feel these forces, making it difficult to coordinate actions and maintain control. To solve this, a haptic system was added: sensors measure the hydrodynamic forces on the rudder, and a force actuation system reproduces them on the steering wheel.

Surgical robots provide another striking example of the critical role of touch. In minimally invasive surgery, the forces generated by tool-tissue interactions are so small that it would be impossible for a surgeon to control the robot accurately without tactile feedback. The Da Vinci robot, famous for its precision, even demonstrated its delicate control by peeling a grape in a viral video. Without haptic feedback that amplifies these tiny forces, the surgeon could not feel them and coordinate the tools with the precision such fine work demands.
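The idea of amplifying tool-tissue forces before rendering them to the operator can be sketched in a few lines. This is a minimal illustration, not the control law of the Da Vinci or any real surgical robot; the gain and safety limit below are illustrative assumptions.

```python
def amplify_force(tool_force_newtons: float, gain: float = 20.0,
                  max_output: float = 5.0) -> float:
    """Scale a tiny tool-tissue force up to a level the surgeon can feel.

    The result is clamped so the haptic device never renders a force
    larger than it (or the operator's hand) can safely handle.
    """
    scaled = tool_force_newtons * gain
    return max(-max_output, min(max_output, scaled))

# A barely measurable 0.05 N tissue interaction becomes a clearly
# perceptible 1 N cue at the surgeon's hand.
print(amplify_force(0.05))   # → 1.0
# Large transients are clamped to the device's safe limit.
print(amplify_force(1.0))    # → 5.0
```

The clamp matters as much as the gain: a force-amplifying loop that faithfully scaled a sudden spike could itself injure tissue or the operator.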

The early 2000s saw a flourishing of haptic technology, with several commercial haptic devices reaching the market. One pioneering system was the PHANToM haptic device, developed by SensAble Technologies, which allowed users to "feel" virtual objects by providing force feedback to the hand. Another notable product was the Novint Falcon, a consumer-level haptic controller launched in 2007 and aimed at gamers and 3D modellers, and many others followed.

However, despite the momentum of haptic systems in the early 2000s, their integration into VR technology and other applications began to slow. This was largely due to the limitations of hardware and wireless communication at the time, which could not meet the low-latency demands of haptic control. In haptics, forces must be updated at a rate of at least 500 Hertz, and even higher when interactions involve high speeds, because below that rate the discrete force updates can themselves be felt as vibration.
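To make the 500 Hz requirement concrete: each force update must fit inside a 2 ms budget, and the loop must not drift. A minimal sketch of such a fixed-rate loop, scheduling against absolute deadlines rather than sleeping a fixed amount after each update, might look like this (the function names are illustrative, not from any particular haptics API):

```python
import time

HAPTIC_RATE_HZ = 500           # minimum update rate cited above
PERIOD = 1.0 / HAPTIC_RATE_HZ  # 2 ms budget per force update

def run_haptic_loop(render_force, n_iterations):
    """Call render_force() at a fixed rate using absolute deadlines.

    Scheduling against absolute deadlines prevents small per-iteration
    delays from accumulating into timing drift.
    """
    next_deadline = time.perf_counter()
    for _ in range(n_iterations):
        render_force()          # sense, compute and actuate the force
        next_deadline += PERIOD
        delay = next_deadline - time.perf_counter()
        if delay > 0:
            time.sleep(delay)
        # a negative delay here means a missed deadline, which a real
        # haptic system would log and compensate for
```

Everything inside `render_force` (sensing, control law, actuator command, and any wireless hop) must complete within those 2 ms, which is exactly the budget that early-2000s wireless links could not reliably meet.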

But now things have changed. Wireless communication is roughly 10 times faster than in the early 2000s with Bluetooth 5.0, and 100 times faster with Wi-Fi 6. Actuator control is handled effectively by inexpensive commercial off-the-shelf electronics managing tens of outputs simultaneously, with Pulse-Width-Modulation peripherals clocked at up to 40 MHz and offering up to 16-bit resolution on platforms like the ESP32. In addition, actuation and sensing technologies have greatly evolved, with the appearance of shape memory alloys, electro-tactile displays, super-miniaturised DC motors and streamlined production methods for electronics. Moreover, the rise of machine learning adds sophistication to haptic feedback: by analysing massive datasets, machine learning models can simulate complex sensations tailored to specific user interactions, enhancing accuracy and immersion.
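What 16-bit PWM resolution buys a haptic designer is easy to quantify: 65,536 distinct drive levels per actuator channel. The sketch below shows the host-side arithmetic for mapping a normalised force command to a PWM duty value; the function name is illustrative, and on a real ESP32 the resulting integer would be handed to the chip's PWM peripheral.

```python
PWM_RESOLUTION_BITS = 16
PWM_MAX_DUTY = (1 << PWM_RESOLUTION_BITS) - 1   # 65535, the top duty step

def force_to_duty(force_command: float) -> int:
    """Map a normalised force command in [0.0, 1.0] to a PWM duty value.

    With 16-bit resolution the actuator sees 65,536 distinct drive
    levels, fine-grained enough that the steps between adjacent levels
    are imperceptible in the rendered sensation.
    """
    clamped = max(0.0, min(1.0, force_command))
    return round(clamped * PWM_MAX_DUTY)

print(force_to_duty(0.0))   # → 0      (actuator off)
print(force_to_duty(1.0))   # → 65535  (full drive)
```

Compare this with the 8-bit PWM common on older hobby microcontrollers: 256 levels, coarse enough that stepping between adjacent vibration intensities can be felt.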

In the coming years, we are set to witness a revolution in immersive experiences driven by the integration of haptic technology across a diverse range of industries. Imagine online shopping where consumers can "touch" and feel the texture of products before purchasing, or healthcare training simulations that replicate the tactile nuances of real-life medical procedures. Even in everyday driving, haptics is already enhancing safety through systems like lane assist. As this technology continues to evolve, it will redefine how we interact with digital and physical environments. For entrepreneurs, investors, and developers, now is the opportune moment to explore the vast potential of haptics. By incorporating this technology into their products and strategies, they can lead the way in creating ground-breaking experiences that tap into the full range of human touch and sensation.

By Evagoras Xydas, Chief Executive Officer, irerobot.com
