JOURNAL ARTICLE

Visual tracking via robust multi-task multi-feature joint sparse representation

Abstract

In this paper, we cast visual tracking as a novel multi-task learning problem that exploits multiple types of visual features. We use an online feature selection mechanism based on the two-class variance ratio measure, applied to log-likelihood distributions computed, for a given feature, from samples of object and background pixels. The proposed method is integrated into a particle filtering framework. We jointly model the underlying relationship across different particles and tackle it in a unified robust multi-task formulation. We show that the proposed formulation can be solved efficiently using the Alternating Direction Method of Multipliers (ADMM) with a small number of closed-form updates. Both qualitative and quantitative results demonstrate the superior performance of the proposed approach compared with several state-of-the-art trackers.
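As an illustration of the feature-selection criterion mentioned above, the following is a minimal sketch (not the authors' code) of the two-class variance ratio: features whose per-bin log-likelihood ratio separates object from background pixels well score high. Histogram bins, the smoothing constant `eps`, and the function name are assumptions for this example.

```python
import numpy as np

def variance_ratio(obj_hist, bg_hist, eps=1e-3):
    """Two-class variance ratio of the log-likelihood ratio.

    obj_hist, bg_hist: non-negative histograms of a feature over
    object and background pixel samples (same binning).
    Higher values indicate a more discriminative feature.
    """
    # Normalize histograms into probability distributions.
    p = obj_hist / obj_hist.sum()
    q = bg_hist / bg_hist.sum()

    # Per-bin log-likelihood ratio; eps avoids log(0) and division by zero.
    L = np.log((p + eps) / (q + eps))

    def var(weights):
        # Variance of L under the given bin weights.
        mean = np.sum(weights * L)
        return np.sum(weights * L ** 2) - mean ** 2

    # Between-class spread (variance under the pooled distribution)
    # divided by within-class spread (sum of per-class variances).
    return var(0.5 * (p + q)) / (var(p) + var(q) + eps)
```

A feature whose object and background histograms are well separated (e.g. disjoint bins) yields a large ratio, while an uninformative feature with identical histograms yields zero; ranking candidate features by this score gives the online selection step described in the abstract.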

Keywords:
Visual tracking; Multi-task learning; Joint sparse representation; Feature selection; Feature extraction; Particle filter; Variance ratio; ADMM

Metrics

Cited by: 3
FWCI (Field-Weighted Citation Impact): 0.50
References: 29
Citation Normalized Percentile: 0.73

Topics

Video Surveillance and Tracking Methods
Image Enhancement Techniques
Advanced Vision and Imaging

All classified under Physical Sciences → Computer Science → Computer Vision and Pattern Recognition.
