Northwestern University engineers have developed the first haptic device that achieves “human resolution,” meaning it accurately matches the sensing abilities of the human fingertip.
Called VoxeLite, the ultra-thin, lightweight, flexible, wearable device recreates touch sensations with the same clarity, detail and speed that skin naturally detects. Similar to a bandage, the device gently wraps around a fingertip to give digital touch the same realism people now expect from today’s screens and speakers.
By combining high spatial resolution with a comfortable, wearable form factor, VoxeLite could transform how people interact with digital environments, including more immersive virtual reality systems, assistive technologies for people with vision impairments, human-robot interfaces and enhanced touchscreens.
The study will be published on Wednesday (Nov. 19) in the journal Science Advances.
“Touch is the last major sense without a true digital interface,” said Northwestern’s Sylvia Tan, who led the study. “We have technologies that make things look and sound real. Now, we want to make textures and tactile sensations feel real. Our device is moving the field toward that goal. We also designed it to be comfortable, so people can wear it for long periods of time without needing to remove it to perform other tasks. It’s like how people wear glasses all day and don’t even think about them.”
“This work represents a major scientific breakthrough in the field of haptics by introducing, for the first time, a technology that achieves ‘human resolution,’” said Northwestern’s J. Edward Colgate, a haptics pioneer and senior author of the study. “It has the ability to present haptic information to the skin with both the spatial and temporal resolution of the sensory system.”
Colgate is the Walter P. Murphy Professor of Mechanical Engineering at Northwestern’s McCormick School of Engineering and director of the National Science Foundation Engineering Research Center on Human AugmentatioN via Dexterity (HAND). Colgate and co-senior author Michael Peshkin, the Allen K. and Johnnie Cordell Breed Senior Professor of Design and professor of mechanical engineering at McCormick, are longtime collaborators and pioneers in the field of haptics technology. Tan is a Ph.D. student at Northwestern’s Center for Robotics and Biosystems, where she is advised by Colgate and Peshkin.
Unsolved problems in haptics
Despite decades of progress in high-definition video and true-to-life audio, digital touch has stubbornly lagged behind. Today’s haptic feedback — mostly simple smartphone vibrations — cannot convey the rich, detailed information the fingertips naturally perceive. This is partly because the skin’s spatial and temporal resolution is notoriously difficult to match.
“Think of very old motion pictures when the number of frames per second was really low, so movements looked jerky. That’s due to low temporal resolution,” Colgate said. “Or think of early computer displays where images were pixelated. That’s low spatial resolution. Nowadays, both problems are solved for graphical displays. For tactile displays, however, they have been far from solved. In fact, very few researchers have even attempted to tackle both of them together.”
Individual pixels of touch
With VoxeLite, Tan, Colgate and Peshkin bring the field much closer to solving these issues. The device features an array of tiny, individually controlled nodes embedded into a paper-thin, stretchable sheet of latex. These soft nodes function like pixels of touch, each capable of pressing into the skin at high speeds and in precise patterns.
Each node comprises a soft rubber dome, a conductive outer layer and a hidden inner electrode. When a small voltage is applied, the node generates electroadhesion, the same principle that causes a balloon to stick to a wall after being rubbed. In their previously developed TanvasTouch technology, Colgate and Peshkin harnessed electroadhesion to modulate friction between a fingertip and a smooth touchscreen surface. In those devices, an applied electric field alters friction to create the illusion of texture, but it does not involve any moving parts.
VoxeLite moves this concept forward. The new technology applies electrostatic forces in a precise, controlled way to make each tiny node “grip” a surface and tilt to press into the skin. This generates a highly localized mechanical force, so each “pixel” of touch pushes on the skin of the fingertip. Higher voltages increase friction during movement, producing more pronounced tactile cues that simulate the feeling of a rough surface. Conversely, lower voltages create less friction and, therefore, the sensation of a smoother, more slippery surface.
“When swiped across an electrically grounded surface, the device controls the friction on each node, leading to controllable indentation on the skin,” Colgate said. “Past attempts to generate haptic effects have been big, unwieldy, complex devices. VoxeLite weighs less than a gram.”
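As a concrete, purely illustrative picture of this control idea, the short Python sketch below maps a desired “roughness” pattern to per-node voltages, with higher voltage standing in for a stronger electroadhesive grip and a harder press into the skin. The grid size, voltage range and simple linear friction model are assumptions made for the example, not parameters reported in the study.

# Illustrative sketch only (not the authors' code): translating a desired
# texture pattern into per-node voltage commands for an array of
# electroadhesive nodes. The grid size, voltage range and the linear
# friction model are assumptions for the example, not published parameters.

import numpy as np

GRID_ROWS, GRID_COLS = 4, 5    # hypothetical node layout on the fingertip
V_MIN, V_MAX = 0.0, 300.0      # assumed drive-voltage range, in volts


def texture_to_voltages(roughness):
    """Map a normalized roughness map (0 = slippery, 1 = rough) to node voltages.

    Higher voltage -> stronger electroadhesive grip -> more friction as the
    finger slides -> the node tilts and presses harder into the skin.
    """
    roughness = np.clip(roughness, 0.0, 1.0)
    return V_MIN + roughness * (V_MAX - V_MIN)


def sliding_friction(voltage, normal_force=0.2, mu0=0.3, gain=1.5e-3):
    """Toy friction model: baseline Coulomb friction plus an electroadhesion
    term that grows with applied voltage (mu0 and gain are made-up values)."""
    return (mu0 + gain * voltage) * normal_force


if __name__ == "__main__":
    # A coarse stripe texture: alternating rough and smooth columns.
    stripes = np.tile([1.0, 0.0, 1.0, 0.0, 1.0], (GRID_ROWS, 1))
    volts = texture_to_voltages(stripes)
    print("Per-node voltages (V):")
    print(volts)
    print("Per-node sliding friction (N):")
    print(np.round(sliding_friction(volts), 3))

Running the sketch prints a striped pattern of voltages and friction values, mirroring how alternating high- and low-friction columns could stand in for a ridged texture under a sliding fingertip.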
Reaching human resolution
To create human-resolution sensations, Tan packed the nodes closely together. In the densest version of the device, the nodes are spaced about 1 millimeter apart. In user testing, Tan used a version with 1.6 millimeters of spacing between nodes.
“The density of the nodes really matters for matching human acuity,” Tan said. “The nodes need to be far enough apart that your body can tell them apart. If two nodes are less than one millimeter apart, your fingertips only sense one node instead of two. But if nodes are too far apart, they cannot recreate fine details. To make sensations that feel real, we wanted to match that human acuity.”
VoxeLite operates in two modes: active and passive. In active mode, the device generates virtual tactile sensations by rapidly tilting and indenting individual nodes as a user’s finger moves across a smooth surface, such as the screen of a smartphone or tablet. The nodes can move up to 800 times per second, covering nearly the full frequency range of human touch receptors.
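To put those two numbers in context, here is a rough back-of-the-envelope Python sketch (not taken from the paper’s analysis) that checks a node pitch and update rate against an approximate 1-millimeter two-point threshold for the fingertip and a commonly cited upper limit of roughly 1,000 Hz for human touch receptors; both figures are approximations used only for illustration.

# Rough back-of-the-envelope check (illustrative, not from the paper): does a
# given node pitch and update rate cover fingertip spatial acuity and the
# bandwidth of human touch receptors? The ~1 mm two-point threshold is the
# figure cited above; the ~1,000 Hz receptor limit is a common approximation.

TWO_POINT_THRESHOLD_MM = 1.0   # approximate finest spacing a fingertip can resolve
RECEPTOR_MAX_HZ = 1000.0       # approximate upper response limit of touch receptors


def meets_spatial_acuity(node_pitch_mm):
    # A pitch at or below the two-point threshold can render detail as fine as
    # the skin itself can distinguish; a coarser pitch leaves some detail out.
    return node_pitch_mm <= TWO_POINT_THRESHOLD_MM


def bandwidth_coverage(update_rate_hz):
    # Fraction of the receptor frequency range the display can drive directly.
    return min(update_rate_hz / RECEPTOR_MAX_HZ, 1.0)


if __name__ == "__main__":
    for pitch_mm in (1.0, 1.6):   # node spacings mentioned in the article
        print(f"{pitch_mm} mm pitch matches ~1 mm acuity: {meets_spatial_acuity(pitch_mm)}")
    print(f"800 Hz updates cover about {bandwidth_coverage(800.0):.0%} of receptor bandwidth")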
Recognizing virtual textures
In a series of experiments, study participants wearing the device accurately and reliably recognized virtual textures, patterns and directional cues. People wearing VoxeLite identified directional patterns (up, down, left and right) with up to 87% accuracy. They also identified real fabrics, including leather, corduroy and terry cloth, with 81% accuracy.
In passive mode, the device essentially disappears. Because it is extremely thin and soft and conforms to the skin, VoxeLite does not interfere with real-world tasks or block the natural sense of touch. As a result, wearers can move seamlessly between real and digital experiences.
For future iterations of the device, the Northwestern team envisions a technology that can be paired with smartphones and tablets. Just as earbuds use Bluetooth to interact with our devices, VoxeLite could someday sync with devices to transform flat, smooth screens into textured interfaces. That could lead to more lifelike online shopping experiences, where shoppers can feel textiles and fabrics before making a purchase. It also could enable tactile maps for people with vision impairments or more interactive games, where players can feel the stretch of a rubber band or the bumpy rocks on a cliff.
“What makes this most exciting is combining spatial and temporal resolution with wearability,” Tan said. “People tend to focus on one of these three aspects because each one is such a difficult challenge. Our lab already solved temporal resolution with electroadhesion. Then, my challenge was to make it spatially distributed and wearable. It did take a while to get here. Now, we’re running studies to understand how humans actually receive and perceive this tactile information.”
The study, “Towards human-resolution haptics: a high bandwidth, high density, wearable tactile display,” was supported by the National Science Foundation (award numbers 2106191 and 2330040).