Abstract

Recently, an algorithm called compressive tracking has been proposed for real-time tracking. It uses a very sparse measurement matrix to compress samples of the target and background, and then trains a classifier to distinguish the foreground from the background. However, compressive tracking extracts foreground features only from the current frame and does not consider samples from previous frames during its learning procedure. This paper presents a novel enhanced algorithm that extracts two kinds of target samples: current samples and previous samples. Our method uses target information from both the previous and current frames to train a novel classifier composed of two weighted sub-classifiers based on different sample bags. Experimental results demonstrate the effectiveness and robustness of our method.
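The two core ideas in the abstract, compressing samples with a very sparse random measurement matrix and combining two weighted sub-classifiers trained on different sample bags, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the matrix construction follows the standard very-sparse random projection used in compressive tracking, while the Gaussian naive-Bayes parameters and the weights `w_prev`, `w_cur` are hypothetical placeholders (the paper learns them from previous- and current-frame samples).

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_measurement_matrix(m, n, s=3):
    """Very sparse random projection: entries are sqrt(s) * {+1, 0, -1}
    with probabilities 1/(2s), 1 - 1/s, 1/(2s)."""
    probs = [1 / (2 * s), 1 - 1 / s, 1 / (2 * s)]
    values = rng.choice([1.0, 0.0, -1.0], size=(m, n), p=probs)
    return np.sqrt(s) * values

def naive_bayes_score(v, mu_pos, sig_pos, mu_neg, sig_neg):
    """Sum of per-feature Gaussian log-likelihood ratios
    (foreground vs. background)."""
    def logpdf(x, mu, sig):
        return -0.5 * np.log(2 * np.pi * sig**2) - (x - mu) ** 2 / (2 * sig**2)
    return np.sum(logpdf(v, mu_pos, sig_pos) - logpdf(v, mu_neg, sig_neg))

# Toy example: compress a 1000-dim image feature vector to 50 features.
R = sparse_measurement_matrix(50, 1000)
x = rng.normal(size=1000)      # stand-in for a high-dimensional sample
v = R @ x                      # compressed feature vector

# Hypothetical sub-classifier parameters for the previous and current
# sample bags (in the paper these are estimated online from tracking).
params_prev = (0.5 * np.ones(50), np.ones(50), -0.5 * np.ones(50), np.ones(50))
params_cur = (0.6 * np.ones(50), np.ones(50), -0.6 * np.ones(50), np.ones(50))

# Assumed weights; the paper's method assigns them adaptively.
w_prev, w_cur = 0.4, 0.6
score = (w_prev * naive_bayes_score(v, *params_prev)
         + w_cur * naive_bayes_score(v, *params_cur))
```

The candidate window with the highest combined score would be selected as the new target location; the sparse matrix means only a handful of pixels contribute to each compressed feature, which is what makes the projection cheap enough for real time.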

Keywords:
Computer vision; Compressed sensing; Pattern recognition; Object tracking; Classifier; Robustness; Artificial intelligence

Metrics

Cited by: 7
FWCI (Field-Weighted Citation Impact): 1.30
References: 22
Citation Normalized Percentile: 0.85
Topics

Video Surveillance and Tracking Methods
Physical Sciences → Computer Science → Computer Vision and Pattern Recognition
Gaze Tracking and Assistive Technology
Physical Sciences → Computer Science → Human-Computer Interaction
Image Enhancement Techniques
Physical Sciences → Computer Science → Computer Vision and Pattern Recognition

Related Documents

BOOK-CHAPTER

Real-Time Compressive Tracking

Kaihua Zhang, Lei Zhang, Ming-Hsuan Yang

Lecture Notes in Computer Science, Year: 2012, Pages: 864-877
JOURNAL ARTICLE

Adaptive weighted real‐time compressive tracking

Jian-zhang Zhu, Yue Ma, Qianqing Qin, Zheng Chen, Yijun Hu

Journal: IET Computer Vision, Year: 2014, Vol: 8 (6), Pages: 740-752
JOURNAL ARTICLE

Real-time multi-scale parallel compressive tracking

Chi-Yi Tsai, Yen-Chang Feng

Journal: Journal of Real-Time Image Processing, Year: 2017, Vol: 16 (6), Pages: 2073-2091