JOURNAL ARTICLE

Robust particle PHD filter with sparse representation for multi-target tracking

Abstract

Recently, sparse representation has been widely used in computer vision and visual tracking applications, including face recognition and object tracking. In this paper, we propose a novel robust multi-target tracking method that applies sparse representation within a particle probability hypothesis density (PHD) filter framework. We employ dictionary learning and principal component analysis (PCA) to train a static appearance model offline with sufficient training data. This pre-trained dictionary contains both colour histogram and histogram of oriented gradients (HOG) features based on foreground target appearances. The tracker combines the pre-trained dictionary with sparse coding to discriminate the tracked targets from background clutter. The sparse coefficients obtained by ℓ1-minimization are used to generate the likelihood function values, which are then applied in the update step of the proposed particle PHD filter. The proposed particle PHD filter is validated on two video sequences from the publicly available CAVIAR and PETS2009 datasets, and demonstrates improved tracking performance in comparison with the traditional particle PHD filter.
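The likelihood step the abstract describes can be sketched in a few lines: each particle's feature vector is sparse-coded against the pre-trained dictionary by ℓ1-minimization, and the reconstruction error drives the likelihood used to weight particles in the PHD update. The sketch below is a minimal illustration under assumptions of our own (an ISTA solver, a Gaussian error-to-likelihood mapping, random synthetic features); it is not the paper's implementation.

```python
import numpy as np

def ista_lasso(D, y, lam=0.1, n_iter=200):
    """Solve min_x 0.5*||y - D x||^2 + lam*||x||_1 by ISTA (iterative
    soft-thresholding). D is the pre-trained dictionary, y a feature vector."""
    L = np.linalg.norm(D, 2) ** 2          # Lipschitz constant of the gradient
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ x - y)
        z = x - grad / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft threshold
    return x

def sparse_likelihood(D, y, sigma=0.1, lam=0.1):
    """Map the sparse-coding reconstruction error to a likelihood value,
    as would be used to weight particles in the PHD update (assumed form)."""
    x = ista_lasso(D, y, lam)
    err = np.linalg.norm(y - D @ x)
    return np.exp(-err ** 2 / (2 * sigma ** 2))

# Synthetic demo: a dictionary of 64 unit-norm atoms in 32 dimensions.
rng = np.random.default_rng(0)
D = rng.standard_normal((32, 64))
D /= np.linalg.norm(D, axis=0)

# A feature close to a dictionary atom (a "target-like" observation) should
# score a higher likelihood than an unrelated clutter vector.
y_target = 0.9 * D[:, 3] + 0.01 * rng.standard_normal(32)
y_clutter = rng.standard_normal(32)
y_clutter /= np.linalg.norm(y_clutter)

print(sparse_likelihood(D, y_target), sparse_likelihood(D, y_clutter))
```

The intuition matches the abstract: foreground targets are well represented by the learned dictionary (small residual, large likelihood), while background clutter reconstructs poorly, so the sparse coefficients act as a discriminative likelihood inside the particle PHD filter's update step.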

Keywords:
Particle filter; PHD filter; Sparse representation; Multi-target tracking; Dictionary learning; K-SVD; HOG; Visual tracking; Clutter

Metrics

Cited by: 5
FWCI (Field-Weighted Citation Impact): 0.84
References: 28
Citation Normalized Percentile: 0.83


Topics

Video Surveillance and Tracking Methods
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Remote-Sensing Image Classification
Physical Sciences →  Engineering →  Media Technology
Gait Recognition and Analysis
Physical Sciences →  Engineering →  Biomedical Engineering