JOURNAL ARTICLE

Towards Open-Set Material Recognition using Robot Tactile Sensing

Abstract

Texture recognition can provide cues for robots interacting with the external environment. The traditional tactile material recognition task is studied under the closed-set assumption, i.e., every material type encountered at test time is included in the training set. However, open-set material recognition is of much greater significance for robots because, in real-world applications, inputs frequently belong to no known class. To the best of our knowledge, no prior work has addressed this problem for robot tactile sensing. To cope with unknown classes, this study proposes Open-set Material Recognition (OpenMR) based on General Convolutional Prototype Learning (GCPL). To handle the open-space risk that GCPL incurs from the lack of unknown samples at training time, we use a Generative Adversarial Network (GAN) to synthesize open-set samples that serve as unknowns. The proposed framework is implemented and tested on two batches of tactile data collected with an electronic skin during different exploratory motions over 8 material textures. Experiments show that, compared with other open-set classifiers, the proposed framework achieves competitive performance in both known-class classification and unknown detection.
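
For readers unfamiliar with prototype-based open-set recognition, the mechanism the abstract describes can be sketched compactly: classify a sample by its distance to learned per-class prototypes, and reject it as unknown when it is far from every prototype. The sketch below is illustrative only and is not the authors' implementation; the backbone, feature dimension, and fixed rejection threshold are assumptions, and in the proposed framework the GAN-synthesized open-set samples would inform where that threshold sits rather than it being hand-picked.

```python
# Minimal sketch (not the authors' code) of distance-to-prototype
# classification with open-set rejection, in the spirit of GCPL.
# All names, shapes, and the threshold value are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PrototypeClassifier(nn.Module):
    def __init__(self, backbone: nn.Module, feat_dim: int, num_known: int):
        super().__init__()
        self.backbone = backbone                      # any CNN feature extractor
        # One learnable prototype vector per known material class.
        self.prototypes = nn.Parameter(torch.randn(num_known, feat_dim))

    def forward(self, x):
        z = self.backbone(x)                          # (B, feat_dim) embeddings
        # Squared Euclidean distance from each embedding to each prototype.
        return torch.cdist(z, self.prototypes) ** 2   # (B, num_known)

def prototype_loss(d2, labels):
    # Treating negative distances as logits, cross-entropy pulls samples
    # toward their own class prototype and away from the others.
    return F.cross_entropy(-d2, labels)

def predict_open_set(model, x, threshold: float):
    """Return class indices; -1 marks samples rejected as 'unknown'."""
    with torch.no_grad():
        d2 = model(x)
        min_d2, pred = d2.min(dim=1)
        pred[min_d2 > threshold] = -1                 # open-set rejection
    return pred
```

In practice the rejection threshold trades off known-class accuracy against unknown detection; synthesizing open-set samples with a GAN, as the abstract proposes, supplies negative examples against which such a boundary can be calibrated.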

Keywords:
Robot tactile sensing, open-set recognition, material recognition, pattern recognition, convolutional neural network

Metrics

Cited by: 1
FWCI (Field-Weighted Citation Impact): 0.29
References: 34
Citation Normalized Percentile: 0.57


Topics

Industrial Vision Systems and Defect Detection
Physical Sciences → Engineering → Industrial and Manufacturing Engineering
Advanced Chemical Sensor Technologies
Physical Sciences → Engineering → Biomedical Engineering
Tactile and Sensory Interactions
Life Sciences → Neuroscience → Cognitive Neuroscience