Quan Khanh Luu, Nhan Huu Nguyen, Van Anh Ho
Large-scale robotic skin with tactile sensing ability is emerging with the potential for use in close-contact human–robot systems. Although recent developments in vision-based tactile sensing and related learning methods are promising, they have mostly been designed for small-scale use, such as in fingers and hands for manipulation tasks. Moreover, learning perception for such tactile devices demands a huge tactile dataset, which complicates the data collection process. To address this, this study introduces a multiphysics simulation pipeline, called SimTacLS, that considers not only the mechanical properties of external physical contact but also the realistic rendering of tactile images in a simulation environment. The system uses the obtained simulation dataset, including virtual images and skin deformation, to train a tactile deep neural network that extracts high-level tactile information. Furthermore, we adopt a generative network to minimize sim2real inaccuracy, preserving the simulation-based tactile sensing performance. Finally, we showcase this sim2real sensing method for our large-scale tactile sensor (TacLink) in two trial cases, namely, whole-arm nonprehensile manipulation and intuitive motion guidance, using a custom-built tactile robot arm integrated with TacLink. This article opens new possibilities for learning transferable tactile-driven robotics tasks from virtual worlds to actual scenarios without compromising accuracy.