Biological tactile perception is closely intertwined with morphology. Complex biological structures such as human fingers, cat paws, and elephant trunks give organisms rich capabilities for interacting with their environment. Existing vision-based tactile sensors in robotics, however, are mostly limited to simple planar geometries, and biomorphic design remains underexplored. Traditional tactile sensors offer limited shape adaptability and capture intricate contact details with limited precision. Developing biomorphic vision-based tactile sensors through trial-and-error hardware iteration faces numerous challenges, including the difficulty of modeling deformation under complex morphologies and of integrating key components. “Meanwhile, learning-based robotic perception and control methods require large-scale data, but data collection in real scenarios is costly, time-consuming, and prone to sensor wear. Existing tactile simulators are mostly confined to flat sensors, so they struggle to meet the simulation needs of biomorphic sensors and fail to address these bottlenecks,” said author Xuyang Zhang, a researcher at King’s College London. “We therefore propose SimTac, a physics-based simulator, to fill the technical gap in the design and validation of biomorphic vision-based tactile sensors and to expand the design space of tactile sensors.”
SimTac is a physics-based simulator for biomorphic vision-based tactile sensors built from three core modules: a particle-based deformation simulation module, a light-field optical rendering module, and a neural-network-based mechanical response prediction module, a structure that balances flexibility and functionality. The deformation module discretizes the sensor membrane and contacting objects through uniform particle sampling, iteratively computes contact deformation with the Material Point Method (MPM), and outputs clean data after postprocessing steps such as occluded-particle removal, camera projection, and depth-map interpolation. The optical rendering module first generates linear and nonlinear light fields offline, then renders images in real time using the Phong lighting model, superimposing the foreground on the background to obtain high-fidelity tactile images. The mechanical response prediction module is built around a Sparse Tensor Network (STN) that maps the deformation data produced by MPM to dense force and deformation fields with FEM-level precision, enabling fast and accurate prediction. The workflow is as follows: given the sensor shape, marker pattern, optical system parameters, and material properties, the particle-based simulation reproduces the membrane's deformation behavior, the light-field rendering generates photorealistic tactile images, and the neural network outputs mechanical responses. The three modules work in concert to simulate biomorphic tactile sensors accurately and efficiently, supporting zero-shot sim-to-real transfer.
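The Phong lighting model mentioned above is a standard shading formula (ambient + diffuse + specular terms). As a rough illustration of the idea only, and not the authors' implementation, per-pixel Phong shading over a normal map (such as one derived from a simulated depth map) can be sketched in NumPy; all function names, coefficients, and the toy scene below are illustrative assumptions:

```python
import numpy as np

def phong_shade(normals, light_dir, view_dir, base_color,
                ka=0.2, kd=0.6, ks=0.3, shininess=32.0):
    """Classic Phong shading per pixel (illustrative sketch, not SimTac's code).

    normals:    (H, W, 3) unit surface normals, e.g. from a simulated depth map
    light_dir:  3-vector pointing toward the light source
    view_dir:   3-vector pointing toward the camera
    base_color: (3,) RGB albedo in [0, 1]
    """
    l = light_dir / np.linalg.norm(light_dir)
    v = view_dir / np.linalg.norm(view_dir)

    ndotl = normals @ l                        # cosine between normal and light
    diff = np.clip(ndotl, 0.0, None)           # diffuse term, back faces clamped

    # Specular term: reflect the light direction about the normal,
    # then compare the reflection with the view direction.
    r = 2.0 * ndotl[..., None] * normals - l
    spec = np.clip(np.sum(r * v, axis=-1), 0.0, None) ** shininess

    shade = ka + kd * diff + ks * spec         # scalar intensity per pixel
    return np.clip(shade[..., None] * base_color, 0.0, 1.0)

# Toy scene: a flat membrane whose normals all point toward the camera.
normals = np.zeros((64, 64, 3))
normals[..., 2] = 1.0
img = phong_shade(normals,
                  light_dir=np.array([0.3, 0.3, 1.0]),
                  view_dir=np.array([0.0, 0.0, 1.0]),
                  base_color=np.array([0.2, 0.5, 0.9]))
print(img.shape)  # (64, 64, 3)
```

In a membrane simulation, the normal map would vary where the surface deforms under contact, so the diffuse and specular terms brighten or darken those pixels, which is what makes contact geometry visible in the rendered tactile image.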
The SimTac simulator was comprehensively verified along four axes: accuracy, efficiency, flexibility, and applicability. On accuracy, the tactile images generated by the optical response simulation closely match real images, scoring well on metrics such as SSIM and PSNR and accurately reproducing details such as contact deformation and reflective light distribution. In the mechanical response simulation, the MAEs of the deformation field and force field on the test set are as low as 2.77×10⁻⁴ mm and 8.6×10⁻⁶ N, respectively, and the total-force prediction error in the normal direction is only 6.27%. On efficiency, when deployed on a GPU, the particle-based deformation simulation, optical rendering, and mechanical prediction modules all meet real-time requirements, reaching peak frame rates of 250, 100, and 100 FPS, respectively. On flexibility, the simulator adapts to diverse biomorphic sensors such as octopus tentacles and cat paws, supports different optical configurations and material parameter adjustments, and can accommodate sensor membranes of different stiffness (soft, medium, hard) by fine-tuning the pre-trained model. On applicability, an elephant-trunk-shaped sensor prototype designed with SimTac was successfully fabricated and performed well in three sim-to-real tasks: object classification, slip detection, and contact safety assessment. The zero-shot transfer accuracies reached 97.0% for object classification and 92.06% for slip detection, and the MAE of contact safety assessment was 0.105, verifying the simulator's practical value in sensor prototype design and robotic tactile perception tasks.
The SimTac simulator proposed in this study achieves accurate and efficient simulation of biomorphic vision-based tactile sensors through the collaborative design of particle-based deformation modeling, light-field rendering, and neural-network-based mechanical prediction, delivering strong performance in optical and mechanical response accuracy, adaptability to diverse morphologies and materials, and simulation efficiency. The simulator fills the technical gap in the design and validation of biomorphic tactile sensors, supporting the development and fabrication of sensor prototypes with various biomorphic structures such as octopus tentacles and elephant trunks, and achieving efficient zero-shot sim-to-real transfer in three core tactile tasks: object classification, slip detection, and contact safety assessment. It provides key technical support for robots to perceive object properties, enhance environmental interaction, and achieve self-protection. SimTac still has limitations: the neural network training relies on Finite Element Method (FEM) ground-truth data, and collecting high-mesh-density data for sensors with entirely new morphologies takes several days (though this can be completed offline with GPU acceleration). “Future research will focus on optimizing data collection efficiency, while expanding the simulator's application to actuator simulation and complex dynamic contact scenarios, further promoting the development of biomorphic tactile sensing technology and adaptive robotic systems,” said Xuyang Zhang.
Authors of the paper include Xuyang Zhang, Jiaqi Jiang, Zhuo Chen, Yongqiang Zhao, Tianqi Yang, Daniel Fernandes Gomes, Jianan Wang, and Shan Luo.
This work was supported by the EPSRC project “ViTac: Visual-tactile synergy for handling flexible materials” (EP/T033517/2).
The paper, “SimTac: A Physics-Based Simulator for Vision-Based Tactile Sensing with Biomorphic Structures,” was published in the journal Cyborg and Bionic Systems on Feb 24, 2026, at spj.science.org/doi/10.34133/cbsystems.0510.
END