JOURNAL ARTICLE

Unsupervised Domain-Adaptive Person Re-Identification with Multi-Camera Constraints

Shun Takeuchi, Fei Li, Sho Iwasaki, Jiaqi Ning, Genta Suzuki

Year: 2022 · Venue: 2022 IEEE International Conference on Image Processing (ICIP) · Vol: 96 · Pages: 1636-1640

Abstract

Person re-identification is a key technology for analyzing video-based human behavior; however, its practical application remains challenging because performance degrades in domains that differ from the training data. Here, we propose an environment-constrained adaptive network for reducing the domain gap. This network refines pseudo-labels estimated via a self-training scheme by imposing multi-camera constraints. The proposed method incorporates person-pair information obtained from the environment, without requiring person identity labels, into model training. In addition, we develop a method that selects, from each pair, the person who contributes to performance improvement. We evaluate the network on public and private datasets and confirm that its performance surpasses state-of-the-art methods in domains with overlapping camera views. To the best of our knowledge, this is the first study on domain-adaptive learning with multi-camera constraints that can be obtained in real environments.
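The abstract's core idea, refining self-training pseudo-labels with multi-camera constraints, can be illustrated with a minimal sketch. The function name, the must-link merging via union-find, and the use of `-1` as a noise label are illustrative assumptions, not the authors' exact algorithm: detections of the same person seen simultaneously by overlapping cameras form positive pairs, and the clusters holding their pseudo-labels are merged.

```python
# Hypothetical sketch: refining clustering pseudo-labels with
# multi-camera must-link constraints. This is NOT the paper's exact
# method; names and the union-find merge step are assumptions.

def refine_pseudo_labels(labels, camera_pairs):
    """Merge pseudo-label clusters for detections that overlapping
    cameras identify as the same person (must-link pairs).

    labels       -- per-detection cluster id from self-training
                    (e.g., DBSCAN output; -1 marks noise/unclustered)
    camera_pairs -- (i, j) detection-index pairs observed simultaneously
                    by cameras with overlapping views
    """
    parent = list(range(max(labels) + 1))  # union-find over cluster ids

    def find(x):
        # Path-halving find: follow parents to the cluster's root.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[rb] = ra  # merge the two clusters

    for i, j in camera_pairs:
        # Skip pairs where either detection is unclustered noise.
        if labels[i] >= 0 and labels[j] >= 0:
            union(labels[i], labels[j])

    # Relabel every detection with its merged cluster root.
    return [find(l) if l >= 0 else -1 for l in labels]
```

For example, if clustering assigned `[0, 0, 1, 2, -1]` and a camera constraint says detections 1 and 2 are the same person, clusters 0 and 1 merge, yielding `[0, 0, 0, 2, -1]`.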

Keywords:
Computer science · Artificial intelligence · Computer vision · Machine learning · Person re-identification · Domain adaptation · Self-training

Metrics

Cited By: 1
FWCI (Field Weighted Citation Impact): 0.07
Refs: 27
Citation Normalized Percentile: 0.29


Topics

Video Surveillance and Tracking Methods (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Human Pose and Action Recognition (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Gait Recognition and Analysis (Physical Sciences → Engineering → Biomedical Engineering)