JOURNAL ARTICLE

Person Re-Identification by Contour Sketch Under Moderate Clothing Change

Qize Yang, Ancong Wu, Wei-Shi Zheng

Year: 2019 | Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence | Vol: 43 (6) | Pages: 2029-2046 | Publisher: IEEE Computer Society

Abstract

Person re-identification (re-id), the task of matching pedestrian images across different camera views, is an important problem in visual surveillance. Re-id has developed substantially in recent years, but most existing models depend largely on color appearance and assume that pedestrians do not change their clothes across camera views. This limitation becomes an issue when tracking a person at different places and at different times if that person (e.g., a criminal suspect) changes his or her clothes: because most existing methods rely heavily on color appearance, they are inclined to match a person to another person wearing similar clothes, and thus they fail. In this work, we call person re-id under clothing change "cross-clothes person re-id." As a first attempt at solving this problem based on visible-light images, we consider the case in which a person changes clothes only moderately; that is, we assume that the person wears clothes of a similar thickness, so that the shape of the body does not change significantly when the weather does not change substantially within a short period of time. We perform cross-clothes person re-id based on a contour sketch of the person image, taking advantage of the shape of the human body instead of color information to extract features that are robust to moderate clothing change. To select/sample more reliable and discriminative curve patterns on a body contour sketch, we introduce a learning-based spatial polar transformation (SPT) layer into the deep neural network to transform contour sketch images, enabling the extraction of reliable and discriminative convolutional neural network (CNN) features in a polar coordinate space. An angle-specific extractor (ASE) is applied in the subsequent layers to extract more fine-grained, angle-specific discriminative features. By varying the sampling range of the SPT, we develop a multistream network that aggregates multi-granularity features to better identify a person. Because no large-scale dataset exists for cross-clothes person re-id, we contribute a new dataset consisting of 33,698 images from 221 identities. Our experiments illustrate the challenges of cross-clothes person re-id and demonstrate the effectiveness of the proposed method.
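The central idea of the SPT described above is resampling a contour sketch image into a polar coordinate space, so that each row of the transformed image corresponds to one viewing angle of the body contour. As a rough illustration only (not the paper's method, which *learns* the sampling parameters end-to-end), the following sketch performs a fixed polar transform with a uniform angle/radius grid and an assumed image-center origin:

```python
import numpy as np

def polar_transform(img, n_angles=64, n_radii=32, center=None):
    """Sample a 2-D image along rays from a center point, producing an
    (n_angles, n_radii) array indexed by (angle, radius).

    This uses a fixed, uniform sampling grid centered on the image; the
    learnable SPT in the paper instead optimizes the sampling so that
    discriminative contour regions are emphasized."""
    h, w = img.shape
    if center is None:
        center = (h / 2.0, w / 2.0)  # assumed origin; the paper learns this
    cy, cx = center
    max_r = min(cy, cx, h - cy, w - cx)  # largest radius fully inside the image
    thetas = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    radii = np.linspace(0.0, max_r, n_radii)
    out = np.zeros((n_angles, n_radii), dtype=img.dtype)
    for i, t in enumerate(thetas):
        # nearest-neighbour sampling along the ray at angle t
        ys = np.clip((cy + radii * np.sin(t)).astype(int), 0, h - 1)
        xs = np.clip((cx + radii * np.cos(t)).astype(int), 0, w - 1)
        out[i] = img[ys, xs]
    return out
```

In this fixed form, varying `max_r` (or slicing the `radii` grid) mimics the varying sampling ranges that the paper's multistream network uses to capture multi-granularity contour features.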

Keywords:
Clothing, Artificial intelligence, Sketch, Identification (biology), Computer science, Computer vision, Pattern recognition (psychology), Geography, Algorithm

Metrics

Cited By: 246
FWCI (Field-Weighted Citation Impact): 7.91
References: 88
Citation Normalized Percentile: 0.98 (in top 1% / top 10%)


Topics

Face recognition and analysis
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Video Surveillance and Tracking Methods
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Gait Recognition and Analysis
Physical Sciences →  Engineering →  Biomedical Engineering