The new wearable tactile rendering system can mimic touch sensations with high spatial resolution and a rapid response rate. (Image: Robotics X Lab and City University of Hong Kong)

The system, developed by a collaborative research team co-led by City University of Hong Kong (CityU) and Chinese tech company Tencent’s Robotics X Laboratory, aims to add the sense of touch to the metaverse for use in virtual-reality shopping and gaming, and could potentially facilitate the work of astronauts and other professionals who must wear thick gloves.

“We can hear and see our families over a long distance via phones and cameras, but we still cannot feel or hug them. We are physically isolated by space and time, especially during this long-lasting pandemic,” said Dr. Yang Zhengbao, Associate Professor, Department of Mechanical Engineering, CityU. “Although there has been great progress in developing sensors that digitally capture tactile features with high resolution and high sensitivity, we still lack a system that can effectively virtualize the sense of touch and record and play back tactile sensations over space and time.”

Existing techniques for reproducing tactile stimuli fall into two categories: mechanical stimulation, which tends to be bulky and limits spatial resolution when integrated into a portable or wearable device, and electrical stimulation, which typically relies on high-voltage direct-current (DC) pulses (up to hundreds of volts).

By contrast, this latest electro-tactile actuator is very thin and flexible, and can be easily integrated into a finger cot. The fingertip wearable displays different tactile sensations in high fidelity (e.g., pressure, vibration, and texture roughness). In addition, a high-frequency alternating-current (AC) stimulation strategy operating below 30 V replaces the high-voltage DC pulses.
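The article specifies only "high-frequency AC under 30 V," not the actual drive waveform. As a minimal sketch of what such a stimulation burst could look like, the snippet below samples a sinusoidal carrier whose amplitude stays under the 30 V ceiling; the carrier frequency, burst duration, and sample rate are all hypothetical:

```python
import math

def ac_burst(carrier_hz=4000.0, amplitude_v=25.0, duration_s=0.005,
             sample_rate=100_000):
    """Sample one sinusoidal AC stimulation burst.

    All parameters are illustrative; the article gives no drive
    specification beyond "high-frequency AC under 30 V".
    """
    n = int(duration_s * sample_rate)
    return [amplitude_v * math.sin(2 * math.pi * carrier_hz * i / sample_rate)
            for i in range(n)]

samples = ac_burst()
peak = max(abs(s) for s in samples)
print(f"{len(samples)} samples, peak amplitude {peak:.1f} V")
```

The point of the sketch is the contrast with DC stimulation: an AC carrier delivers charge-balanced cycles at a bounded amplitude rather than isolated high-voltage pulses.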

“Our new system can elicit tactile stimuli with both high spatial resolution (76 dots/cm²), similar to the density of related receptors in human skin, and a rapid response rate (4 kHz),” said Lin Weikang, a CityU PhD student who made and tested the device.
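As a quick back-of-the-envelope check on that density figure, assuming the dots form a uniform square grid (the article does not state the layout), 76 dots/cm² corresponds to a center-to-center dot spacing of roughly 1.1 mm:

```python
import math

density_per_cm2 = 76  # spatial resolution quoted in the article

# For a uniform square grid, pitch = side length of one dot's cell.
# 1 cm = 10 mm, so pitch in mm = 10 / sqrt(dots per cm^2).
pitch_mm = 10 / math.sqrt(density_per_cm2)
print(f"dot pitch ≈ {pitch_mm:.2f} mm")  # prints "dot pitch ≈ 1.15 mm"
```

A spacing on the order of a millimeter is finer than the two-point discrimination threshold typically cited for the fingertip (a few millimeters), which is consistent with the claim that the array approaches the density of the skin's mechanoreceptors.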

The team also proposed a new Braille strategy that breaks down the alphabet and numerical digits into individual strokes ordered in the same manner they’re written.

“This would be particularly useful for people who lose their eyesight later in life, allowing them to continue to read and write using the same alphabetic system they are used to, without the need to learn the whole Braille dot system,” said Yang.

The new system is also well-suited to VR/AR applications and games, bringing touch to the metaverse. The team demonstrated that a user can virtually sense the texture of clothes in a virtual fashion shop, and can even get an itchy sensation when licked by a VR cat.

Lastly, the team integrated the thin, light electrodes of the electrotactile rendering system into flexible tactile sensors on safety gloves. The tactile sensor array captures the pressure distribution on the glove’s exterior and relays the information to the user in real time through tactile stimulation.
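The glove demo described above is a sense-then-render loop: exterior pressure in, tactile stimulation out. The toy function below sketches one frame of that mapping with a simple linear transfer function; the array size, pressure units, and clipping behavior are all assumptions, as the article does not specify them:

```python
def relay_frame(pressure_frame, max_pressure_kpa=100.0, max_intensity=1.0):
    """Map one frame of exterior pressure readings (kPa) to stimulation
    intensities (0..1) for the matching electro-tactile dots.

    A toy linear mapping with clipping; the real transfer function,
    array geometry, and units are not given in the article.
    """
    return [[min(p / max_pressure_kpa, 1.0) * max_intensity for p in row]
            for row in pressure_frame]

# One 2x3 sensor frame: a firm contact in one corner, light touch elsewhere.
frame = [[120.0, 10.0, 0.0],
         [5.0, 0.0, 0.0]]
print(relay_frame(frame))  # [[1.0, 0.1, 0.0], [0.05, 0.0, 0.0]]
```

In a real device this mapping would run once per frame at the sensor's sampling rate, so the quoted 4 kHz stimulator response rate comfortably keeps up with the sensor side.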

“We expect our technology to benefit a broad spectrum of applications, such as information transmission, surgical training, teleoperation, and multimedia entertainment,” said Yang.

Here is a Tech Briefs interview with Zhengbao, edited for clarity.

Tech Briefs: What inspired the research?

Zhengbao: Frankly speaking, there was no Eureka moment for this research. Our lab and Tencent collaborators have worked on tactile sensors and stimulators for a long time. Why did we start this? I think it was mainly motivated by a real need in the industry.

Tencent is a global giant in the gaming industry and VR/AR technology, and the COVID pandemic has tormented the human race for three years with lonely quarantines and societal shutdowns. All of this motivated us to develop tactile rendering systems to better enjoy video games and maybe ‘hug’ our dear families during quarantine.

In addition, during the R&D, China's human spaceflight program was on the front pages of newspapers. We saw our astronauts wearing thick, bulky space suits; they cannot perform tasks that would be very easy on Earth with bare hands. So we were thinking of helping them, and maybe also firefighters, feel the outside world indirectly through thick gloves.

Tech Briefs: What were the biggest technical challenges?

Zhengbao: Existing techniques to reproduce tactile stimuli can be broadly classified into two categories, mechanical or electrical stimulation. By applying localized mechanical force or vibration on the skin, mechanical actuators can elicit stable and continuous tactile sensations. However, these mechanical actuators tend to be bulky, severely limiting the spatial resolution when integrated into a portable or wearable device.

Even though electrotactile stimulators can be light and flexible, while offering higher resolution and faster response, the voltage applied to the skin can reach hundreds of volts. This high voltage is needed to penetrate the stratum corneum layer and stimulate the mechanoreceptors and nerves beneath, which poses a safety concern.

Tech Briefs: Can you explain in simple terms how your technology works?

Zhengbao: First, we need to know why we feel tactile perception at all. When an external force deforms the skin, mechanosensitive ion channels open, depolarizing the soma of the mechanoreceptors and triggering action potentials that propagate to the somatosensory cortex through peripheral nerve bundles. So there is no doubt that electrical stimulation can mimic natural human tactile perception; but we needed to simultaneously overcome the high operating voltage and further improve the tactile rendering resolution.

Tech Briefs: What’s the next step with regards to your research/testing?

Zhengbao: A common weakness of electrical stimulators is the inability to precisely stimulate just the slowly adapting (SA) mechanoreceptors, which detect sustained pressure, without also activating the fast adapting (FA) receptors, which respond to the onset and offset of stimulation. This makes it challenging to produce the sensation of sustained pressure, and we want to overcome this challenge in our future work.

Tech Briefs: How far away are we from the wearable tactile rendering system becoming available to the average person? Completely ubiquitous?

Zhengbao: It is hard to say; maybe five years. It also depends on the development of related technologies, such as the metaverse. We believe this technology will attract the attention of both academia and industry, accelerating its path to market.

Tech Briefs: Anything else you’d like to add?

Zhengbao: We demonstrated its application potential as a braille display, as a way to add the sense of touch to the metaverse, such as in virtual-reality shopping and gaming, and as an aid to the work of astronauts, deep-sea divers, or others who need to wear thick gloves.

Apart from VR/AR applications, we also want to highlight the application of braille displays. One major problem the visually impaired face when learning braille is the disconnect between the systems used for reading and writing. We propose a new braille strategy to enable the visually impaired to use the same alphabetical system to read and write.

We use the tactile continuity illusion, whereby sensory inputs distinct in space and time are naturally pieced together by our somatosensory system to form a continuous sensation. We break down the letters of the alphabet and numerical digits into individual strokes and order them the way they are written.
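The stroke-by-stroke scheme described above can be sketched as a small playback schedule. Everything below is illustrative: the stroke names, the per-stroke timing, and the letter decompositions are hypothetical stand-ins, since the team's actual stroke set is not published in this article:

```python
# Hypothetical stroke sequences: each character is rendered stroke by
# stroke, in the order a sighted writer would draw it.
STROKES = {
    "L": ["vertical_down", "horizontal_right"],
    "T": ["horizontal_right", "vertical_down"],
    "7": ["horizontal_right", "diagonal_down_left"],
}

def render_word(word, stroke_ms=120):
    """Return a playback schedule of (character, stroke, start-time-ms)
    tuples that a dot-matrix stimulator could step through, relying on
    the tactile continuity illusion to fuse the sequential strokes into
    one perceived letter shape."""
    schedule, t = [], 0
    for char in word:
        for stroke in STROKES[char]:
            schedule.append((char, stroke, t))
            t += stroke_ms
    return schedule

for char, stroke, t in render_word("LT"):
    print(f"{t:4d} ms  {char}: {stroke}")
```

Because a reader feels the same stroke order used in handwriting, someone who loses their sight later in life could reuse their existing knowledge of letter shapes instead of memorizing the traditional braille dot cells.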