JOURNAL ARTICLE

Contextual adaptive particle filtering for robust real-time non-rigid object tracking

Abstract

Particle filtering with a color-distribution appearance model is one of the most widely used approaches to visual tracking. In practice, however, a color distribution alone is an insufficient description of the tracked object. In this paper, we present an adaptive contextual particle filtering algorithm that integrates multiple cues for non-rigid object tracking, designed to handle illumination variation, scale change, and complex non-rigid motion. Low-level contextual information computed from Haralick texture features is combined with color cues into a model describing the appearance of the target. The likelihood of each cue is computed separately, and the algorithm relies on factorizing the overall likelihood as the product of the per-cue likelihoods. Moving-object extraction is performed at each frame to initialize the filter and to adapt each particle's search space to the true dimensions of the tracked target. Experimental results show improved tracking accuracy and robust recovery under very challenging conditions.
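The cue-fusion idea in the abstract can be sketched in a toy 1-D particle filter. This is not the authors' implementation: the Gaussian likelihood kernels, the cue-distance functions, and all parameter values below are illustrative assumptions; only the structure (predict, weight by a product of per-cue likelihoods, resample) follows the abstract.

```python
import math
import random

def gaussian_likelihood(dist, sigma):
    # Gaussian kernel over a per-cue distance (e.g. a Bhattacharyya
    # distance between a particle's histogram and the target model).
    return math.exp(-dist * dist / (2.0 * sigma * sigma))

def fused_weight(color_dist, texture_dist, sigma_color=0.2, sigma_texture=0.3):
    # Likelihood factorization: the joint likelihood is the product
    # of the independent per-cue likelihoods (color and texture).
    return (gaussian_likelihood(color_dist, sigma_color) *
            gaussian_likelihood(texture_dist, sigma_texture))

def systematic_resample(particles, weights, rng):
    # Draw len(particles) samples with probability proportional to weight.
    total = sum(weights)
    step = total / len(particles)
    u = rng.uniform(0.0, step)
    out, cum, i = [], weights[0], 0
    for _ in particles:
        while cum < u:
            i += 1
            cum += weights[i]
        out.append(particles[i])
        u += step
    return out

def track_step(particles, color_dist_fn, texture_dist_fn, rng, motion_sigma=0.5):
    # Predict: diffuse each particle with a random-walk motion model.
    predicted = [p + rng.gauss(0.0, motion_sigma) for p in particles]
    # Update: weight each particle by the fused (product) likelihood.
    weights = [fused_weight(color_dist_fn(p), texture_dist_fn(p))
               for p in predicted]
    # Estimate: weighted mean of the particle set.
    total = sum(weights)
    estimate = sum(p * w for p, w in zip(predicted, weights)) / total
    return systematic_resample(predicted, weights, rng), estimate

if __name__ == "__main__":
    rng = random.Random(0)
    target = 10.0  # true (hidden) 1-D position of the object
    # Toy cue distances that grow with distance from the target,
    # standing in for histogram distances computed on real frames.
    color = lambda p: abs(p - target) / 5.0
    texture = lambda p: abs(p - target) / 4.0
    particles = [rng.uniform(0.0, 20.0) for _ in range(500)]
    for _ in range(20):
        particles, estimate = track_step(particles, color, texture, rng)
    print(round(estimate, 1))
```

Because the fused weight is a product, a particle must score well on *both* cues to survive resampling, which is what makes the multi-cue model more discriminative than color alone.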

Keywords:
Computer vision, artificial intelligence, particle filter, video tracking, robustness, initialization, object detection, mean-shift, active appearance model, pattern recognition, filtering, image processing

Metrics

Cited by: 2
FWCI (Field-Weighted Citation Impact): 0.52
References: 16
Citation Normalized Percentile: 0.70


Topics

Video Surveillance and Tracking Methods (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Remote-Sensing Image Classification (Physical Sciences → Engineering → Media Technology)
Remote Sensing and Land Use (Physical Sciences → Earth and Planetary Sciences → Atmospheric Science)