JOURNAL ARTICLE

Simulation of Vision-based Tactile Sensors with Efficiency-tunable Rendering

Abstract

Vision-based tactile sensors (VBTS) leverage the visual modality to provide high-resolution tactile information. The vision-based sensing mechanism adapts well to robot manipulation because contact information is easy to capture and process as images. However, simulating VBTS is challenging because it involves both computer graphics and elasto-plastic deformation. In this paper, we propose a simulation approach for VBTS built on path-tracing rendering. The rendering method can be integrated with existing VBTS and robotic simulators to simulate complete robotic systems. In addition, the method trades rendering efficiency against quality by controlling the number of voxels used to fit the deformation area, which meets the demands of efficient robot training. Experimental results verify that our method can either produce high-quality images at lower rendering speed or reduce image quality to raise rendering efficiency. Our work has the potential to advance sim-to-real research on VBTS.
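The efficiency-tunable idea described in the abstract can be illustrated with a minimal sketch: the contact deformation of the sensor surface is fitted on a voxel grid, and the grid resolution sets the quality/speed trade-off. The function name, grid layout, and synthetic indentation below are hypothetical illustrations, not the paper's actual implementation.

```python
import numpy as np

def fit_deformation_voxels(depth_map, n_voxels_per_axis):
    """Fit a contact depth map onto a coarse voxel grid.

    Fewer voxels -> faster downstream rendering but coarser geometry
    (a hypothetical stand-in for the paper's voxel-fitting step).
    """
    h, w = depth_map.shape
    # Voxel-column boundaries over the sensor surface.
    ys = np.linspace(0, h, n_voxels_per_axis + 1, dtype=int)
    xs = np.linspace(0, w, n_voxels_per_axis + 1, dtype=int)
    grid = np.zeros((n_voxels_per_axis, n_voxels_per_axis))
    for i in range(n_voxels_per_axis):
        for j in range(n_voxels_per_axis):
            # One representative depth value per voxel column.
            grid[i, j] = depth_map[ys[i]:ys[i + 1], xs[j]:xs[j + 1]].mean()
    return grid

def recon_error(grid, depth_map):
    """Nearest-neighbour upsample back to full resolution, mean abs error."""
    f = depth_map.shape[0] // grid.shape[0]
    return np.abs(np.repeat(np.repeat(grid, f, 0), f, 1) - depth_map).mean()

# Synthetic spherical indentation on a 128x128 sensor surface.
y, x = np.mgrid[0:128, 0:128]
depth = np.sqrt(np.clip(40.0**2 - ((x - 64.0)**2 + (y - 64.0)**2), 0.0, None))

coarse = fit_deformation_voxels(depth, 8)   # fast, low quality
fine = fit_deformation_voxels(depth, 64)    # slower, high quality
```

Raising the voxel count shrinks the reconstruction error of the fitted deformation (and hence of the rendered tactile image) at the cost of more geometry for the path tracer to process, which is the efficiency/quality dial the paper exposes for robot-training workloads.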

Keywords:
Rendering (computer graphics), Computer science, Computer vision, Artificial intelligence, Path tracing, Robot, Image-based modeling and rendering, Computer graphics (images)

Topics

Advanced Sensor and Energy Harvesting Materials (Physical Sciences → Engineering → Biomedical Engineering)
Tactile and Sensory Interactions (Life Sciences → Neuroscience → Cognitive Neuroscience)
Robot Manipulation and Learning (Physical Sciences → Engineering → Control and Systems Engineering)