Vision-based tactile sensors (VBTSs) leverage the visual modality to provide high-resolution tactile information. The vision-based sensing mechanism adapts well to robot manipulation because of its convenient information capture and data processing. However, simulating VBTSs is challenging because it involves both computer graphics and elasto-plastic deformation. In this paper, we propose a simulation approach for VBTSs that renders tactile images using path tracing. The rendering method can be integrated with existing VBTS and robotic simulators to simulate complete robotic systems. In addition, the method can trade rendering quality against efficiency by adjusting the number of voxels used to fit the deformation area, meeting the requirements of efficient robot training. Experimental results verify that our method can provide high-quality images at the cost of rendering speed, or reduce image quality to improve rendering speed. Our work has the potential to advance sim-to-real research on VBTSs.
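The quality/efficiency trade-off described above can be illustrated with a minimal sketch: a deformation profile is fitted with a piecewise-constant voxel grid, and finer grids (more voxels) approximate the contact surface more closely at higher cost. The functions and the example profile here are hypothetical illustrations, not the paper's actual implementation.

```python
# Hypothetical sketch of the voxel-count trade-off: more voxels fit the
# deformation area better (higher image quality) but cost more render work.

def voxelize(profile, n_voxels):
    """Piecewise-constant fit of a 1-D depth profile using n_voxels cells."""
    n = len(profile)
    fit = []
    for v in range(n_voxels):
        lo = v * n // n_voxels
        hi = (v + 1) * n // n_voxels
        cell = profile[lo:hi]
        mean = sum(cell) / len(cell)  # one value per voxel cell
        fit.extend([mean] * len(cell))
    return fit

def fit_error(profile, fit):
    """Mean absolute error between the true profile and its voxel fit."""
    return sum(abs(a - b) for a, b in zip(profile, fit)) / len(profile)

# A smooth indentation profile (e.g., a sphere pressed into the sensor gel).
profile = [max(0.0, 1.0 - ((i - 50) / 30.0) ** 2) for i in range(100)]

coarse = fit_error(profile, voxelize(profile, 5))   # fast, lower quality
fine = fit_error(profile, voxelize(profile, 50))    # slower, higher quality
assert fine < coarse  # more voxels -> closer fit to the deformation area
```

In the actual simulator the voxels discretize a 3-D deformation volume for path tracing, but the same principle applies: voxel count is the knob that balances rendering fidelity against throughput for robot training.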