JOURNAL ARTICLE

Transformer-based Contrastive Learning for Unsupervised Person Re-Identification

Abstract

Unsupervised person re-identification (Re-ID) methods have been dominated by convolutional neural networks (CNNs) for many years. Most current methods apply pseudo-label-based contrastive learning (CL) and have achieved great progress. However, they have limited capacity to represent global features, suffer severe performance drops when trained with limited computing resources, and cannot effectively exploit pseudo-label information during CL training. To tackle these problems, we propose a Transformer-based Contrastive Learning (TransCL) method that enhances the performance of CL and improves the feature representation ability of Re-ID. In TransCL, a batch and memory contrast (BMC) strategy is developed to optimize multi-level CL tasks concurrently and fully exploit pseudo-label information, and a GCN aggregated clustering (GAC) scheme is designed to generate more effective pseudo labels for CL. Extensive experimental results indicate that GAC and BMC, working with a vision transformer (ViT), achieve better training performance and enhance the representation ability of the Re-ID model. TransCL surpasses the state-of-the-art CNN-based method by 8.0% mAP on the challenging MSMT17 dataset.
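To make the abstract's core ingredient concrete, the following is a minimal sketch of pseudo-label-based contrastive learning against a cluster-centroid memory bank, a common building block in this line of work. It is illustrative only: the function name, shapes, and temperature value are assumptions, and it does not reproduce the paper's exact BMC strategy or GAC clustering.

```python
import torch
import torch.nn.functional as F

def cluster_contrastive_loss(features, pseudo_labels, memory, temperature=0.05):
    """InfoNCE-style loss pulling each embedding toward its pseudo-label
    cluster centroid and away from all other centroids (illustrative sketch,
    not the paper's exact BMC formulation).

    features:      (B, D) L2-normalised embeddings from the backbone (e.g. ViT)
    pseudo_labels: (B,)   cluster index assigned to each sample
    memory:        (C, D) L2-normalised cluster centroids in the memory bank
    """
    # Cosine similarity of each sample to every centroid, scaled by temperature.
    logits = features @ memory.t() / temperature
    # Cross-entropy treats the sample's own cluster centroid as the positive.
    return F.cross_entropy(logits, pseudo_labels)

# Toy usage with random data standing in for backbone features and clustering output.
torch.manual_seed(0)
feats = F.normalize(torch.randn(8, 16), dim=1)
centroids = F.normalize(torch.randn(4, 16), dim=1)
labels = torch.randint(0, 4, (8,))
loss = cluster_contrastive_loss(feats, labels, centroids)
```

In the full method, the paper's BMC strategy would optimize several such contrastive objectives concurrently (batch-level and memory-level), rather than this single centroid-level term.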

Keywords:
Computer science, Feature learning, Transformer, Cluster analysis, Artificial intelligence, Convolutional neural network, Machine learning, Unsupervised learning, Pattern recognition, Engineering

Metrics

Cited by: 5
FWCI (Field-Weighted Citation Impact): 0.35
References: 65
Citation Normalized Percentile: 0.65

Topics

Video Surveillance and Tracking Methods (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Human Pose and Action Recognition (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Advanced Neural Network Applications (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)