Technology 2026-03-04 3 min read

Robots Could Soon Feel Like Elephant Trunks and Octopus Tentacles

A new physics-based simulator called SimTac solves a longstanding bottleneck in robotics by accurately modeling touch sensors shaped like biological structures - at up to 250 frames per second.

The human fingertip contains roughly 2,500 touch receptors per square centimeter. When you pick up a grape without crushing it, those receptors feed continuous information about pressure, texture, and deformation to your nervous system, which makes thousands of micro-adjustments per second. Robots, by comparison, are largely numb.

Vision-based tactile sensors - camera-based systems embedded in a robot's fingertip that detect contact through light patterns - have made genuine progress. But they are almost universally flat. Real biological touch structures are not. Cat paws curve. Elephant trunks flex in three dimensions. Octopus suckers conform to irregular shapes. Designing sensors that match these geometries has been stalled by a fundamental problem: there was no good way to simulate how such sensors would behave before building them.

Why simulation matters so much in robotics

Building tactile sensors is expensive and time-consuming. Every design iteration requires fabrication, testing, and data collection. And training the machine learning algorithms that make sense of sensor data requires enormous amounts of labeled examples - which, in real-world conditions, means hours of human-supervised interaction with real objects, risking sensor wear each time.

The alternative is simulation: build a virtual version of the sensor, generate training data synthetically, and transfer the learned behavior to the real hardware. This works well for flat sensors. For complex, biologically-shaped sensors, it has barely worked at all.

SimTac, developed by Xuyang Zhang and colleagues at King's College London and published in Cyborg and Bionic Systems, is designed to close that gap.

Three modules working in concert

The simulator combines three interlocking components. A particle-based deformation module uses the Material Point Method to calculate how a soft sensor membrane deforms when it contacts an object - handling the complex, irregular geometries that trip up simpler approaches. A light-field optical rendering module then translates those deformations into realistic camera images, mimicking the way light behaves inside the sensor. A third module, built around a Sparse Tensor Network, predicts the mechanical forces involved at speeds that make real-time operation possible.
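To make the data flow between those three stages concrete, here is a minimal toy sketch. Every function and constant below is a hypothetical stand-in: the real modules solve Material Point Method dynamics, light-field optics, and a learned Sparse Tensor Network, none of which this sketch implements. It only illustrates how a contact becomes a deformation field, an image, and a force estimate.

```python
import numpy as np

def deform_membrane(particles, contact_center, radius=0.3, depth=0.05):
    """Stage 1 stand-in: displace membrane particles near the contact point.
    (The real module solves Material Point Method dynamics.)"""
    d = np.linalg.norm(particles[:, :2] - contact_center, axis=1)
    dz = np.where(d < radius, depth * (1 - d / radius), 0.0)
    deformed = particles.copy()
    deformed[:, 2] -= dz            # press contacted particles inward
    return deformed, dz             # dz is the per-particle deformation field

def render_tactile_image(dz, grid=32):
    """Stage 2 stand-in: map deformation to pixel intensities.
    (The real module performs light-field optical rendering.)"""
    img = np.zeros((grid, grid))
    n = int(np.sqrt(len(dz)))
    img[:n, :n] = dz[: n * n].reshape(n, n)  # toy projection to image plane
    return img / (img.max() + 1e-9)          # normalize like camera exposure

def predict_normal_force(dz, stiffness=50.0):
    """Stage 3 stand-in: total normal force from summed deformation.
    (The real module uses a learned Sparse Tensor Network predictor.)"""
    return stiffness * dz.sum()

# A flat 16x16 membrane of particles at z = 0, poked at its center.
xs, ys = np.meshgrid(np.linspace(0, 1, 16), np.linspace(0, 1, 16))
particles = np.stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)], axis=1)

deformed, dz = deform_membrane(particles, contact_center=np.array([0.5, 0.5]))
image = render_tactile_image(dz)
force = predict_normal_force(dz)
```

The point of the decomposition is that each stage can be swapped or accelerated independently, which is how the real system reaches different frame rates per module.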

The accuracy numbers are notable. Mean absolute error in deformation field prediction reached 2.77 × 10⁻⁴ mm; force field prediction error was 8.6 × 10⁻⁶ N; and total force prediction error in the normal direction was just 6.27 percent. On a GPU, the three modules run at frame rates of 250, 100, and 100 FPS respectively - fast enough to use in real-time control loops.
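As a rough sanity check on what those frame rates mean for control, a pipeline is only as fast as its slowest stage. The article does not say whether the modules run sequentially or are pipelined, so this back-of-envelope sketch computes both bounds from the reported rates:

```python
# Reported per-module rates: deformation 250 FPS, rendering and force 100 FPS.
rates_fps = {"deformation": 250, "rendering": 100, "force": 100}
times_ms = {name: 1000.0 / fps for name, fps in rates_fps.items()}

# If the stages run back-to-back each step, per-frame times add up.
sequential_ms = sum(times_ms.values())      # 4 + 10 + 10 = 24 ms per step
sequential_rate = 1000.0 / sequential_ms    # roughly 41.7 Hz end to end

# If the stages are pipelined, throughput is limited by the slowest stage.
pipelined_rate = min(rates_fps.values())    # 100 Hz throughput
```

Either bound clears the tens-of-hertz regime typical of robot contact control, which is what makes the "real-time" claim plausible.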

From octopus tentacles to elephant trunks

The team validated SimTac on sensors shaped like octopus tentacles and cat paws, then went further: they designed and physically built an elephant trunk-shaped sensor prototype based purely on simulation outputs. That prototype was then tested on three practical tasks - object classification, slip detection, and contact safety assessment - without any additional real-world training. The zero-shot transfer accuracy rates were 97.0 percent and 92.06 percent for the first two tasks, respectively, and the safety assessment error was 0.105.

Zero-shot transfer - the ability to move directly from simulation to physical hardware without additional calibration - is the hardest test for any simulator. These numbers are competitive with results from systems designed for flat sensors that have had years of development.

What still needs solving

The system has a real limitation. Training the neural network requires ground-truth data generated by Finite Element Method simulations, and collecting high-resolution FEM data for a sensor with an entirely new shape takes several days - even with GPU acceleration and offline processing. This is not a deal-breaker for a research context, but it would need to be addressed before the tool could be used routinely in industrial design cycles.

"Future research will focus on optimizing data collection efficiency, while expanding the simulator's application in actuator simulation and complex dynamic contact scenarios," said Zhang.

For now, SimTac represents a meaningful advance in the toolkit available to robotics researchers who want to build machines that can actually feel what they are touching - in shapes that go well beyond the flat rectangle.

Source: Xuyang Zhang et al., "SimTac: A Physics-Based Simulator for Vision-Based Tactile Sensing with Biomorphic Structures," Cyborg and Bionic Systems (February 24, 2026). DOI: 10.34133/cbsystems.0510. Supported by EPSRC project EP/T033517/2.