JOURNAL ARTICLE

Suspicious Object Tracking by Frame Differencing with Background Subtraction

Abstract

To help prevent terrorism, tracking of objects by stationary surveillance cameras is widely employed for security in public spaces such as railway stations, airports, parking lots, and public transit, and accurate object detection in visual scenes underpins many such applications across a range of vision algorithms. In this paper, we describe a model for monitoring many objects simultaneously and identifying unclaimed luggage in a real-time setting. The model first reconstructs the background scene from the original frame; moving items such as people and parcels are then detected and followed using a background-subtracted motion model. The proposed approach additionally records the past positions of moving items and employs frame-differencing methods to trace previously seen packages and identify any that were dropped by people. The proposed system was tested in several indoor and outdoor scenes with diverse illumination conditions using the PETS 2006 and PETS 2007 datasets, and the real-time system was run in MATLAB Simulink.
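The abstract combines two standard motion cues: background subtraction flags pixels that differ from a learned background model, while frame differencing flags pixels that changed since the previous frame. A pixel that is foreground yet static across frames is a candidate for dropped luggage. The sketch below illustrates this combination under simplifying assumptions (frames as small 2D lists of grayscale values, a fixed threshold, and illustrative function names); it is not the paper's implementation, which runs in MATLAB Simulink.

```python
# Illustrative sketch: combine background subtraction and frame
# differencing to separate moving foreground (e.g. a person) from
# static foreground (e.g. an abandoned parcel). All names, the
# threshold, and the toy frames are assumptions, not the paper's code.

def diff_mask(a, b, threshold):
    """Binary mask: 1 where |a - b| > threshold, else 0."""
    return [[1 if abs(x - y) > threshold else 0
             for x, y in zip(row_a, row_b)]
            for row_a, row_b in zip(a, b)]

def classify_pixels(frame, prev_frame, background, threshold=20):
    """Label each pixel as:
       'moving'     -- foreground AND changed since the last frame
       'abandoned'  -- foreground but static (dropped-object candidate)
       'background' -- matches the background model
    """
    fg = diff_mask(frame, background, threshold)  # background subtraction
    mv = diff_mask(frame, prev_frame, threshold)  # frame differencing
    labels = []
    for row_fg, row_mv in zip(fg, mv):
        row = []
        for f, m in zip(row_fg, row_mv):
            if f and m:
                row.append("moving")
            elif f:
                row.append("abandoned")
            else:
                row.append("background")
        labels.append(row)
    return labels

if __name__ == "__main__":
    background = [[10, 10], [10, 10]]
    prev_frame = [[10, 200], [10, 10]]  # a parcel appeared at (0, 1)
    frame      = [[10, 200], [90, 10]]  # parcel static; motion at (1, 0)
    labels = classify_pixels(frame, prev_frame, background)
    print(labels[0][1], labels[1][0], labels[1][1])
    # prints: abandoned moving background
```

In a real system the "static foreground" label would additionally be required to persist for some number of frames before an alarm is raised, to avoid flagging briefly stationary people.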

Keywords:
Background subtraction, Computer science, Frame differencing, Computer vision, Artificial intelligence, Tracking, Object detection, MATLAB, Identification, Motion, Video tracking, Public security, Real-time computing, Pixel, Pattern recognition, Telecommunications

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
Refs: 12
Citation Normalized Percentile: 0.08

Topics

Video Surveillance and Tracking Methods
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Fire Detection and Safety Systems
Physical Sciences →  Engineering →  Safety, Risk, Reliability and Quality
Advanced Measurement and Detection Methods
Physical Sciences →  Engineering →  Electrical and Electronic Engineering