JOURNAL ARTICLE

Detecting Moving Object Using Background Subtraction Algorithm in FPGA

Abstract

Image and video processing applications tend to have real-time constraints. Applications such as visual surveillance, traffic monitoring, vehicle tracking, autonomous navigation, and computer vision have the basic requirement of identifying moving objects in real time. Hardware-based approaches are well suited for real-time motion detection, as they deliver high performance at low cost. For rapid development of real-time motion detection systems, we propose a hardware architecture for motion detection based on the background subtraction algorithm, implemented on FPGAs. The steps involved in the process are: (a) a grey-level background image is stored in SRAM memory on the FPGA; (b) color reduction is applied to both the background and current images; (c) both filtered images are then subtracted using image subtraction; (d) the gravity center of the object in the resultant image is calculated and sent to a PC (via an RS-232 interface); (e) the Sobel edge detection algorithm is used to identify the object's edges. Identifying the object's edges could be extended to classifying objects based on their shapes.
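The pipeline described above (background subtraction, gravity-center computation, and Sobel edge detection) can be sketched in software as follows. This is a minimal illustrative model of the algorithm, not the paper's FPGA implementation; the threshold value and array shapes are assumptions chosen for clarity.

```python
import numpy as np

def detect_motion(background, current, threshold=30):
    """Absolute-difference background subtraction with a fixed
    threshold (threshold value is an illustrative assumption)."""
    diff = np.abs(current.astype(np.int16) - background.astype(np.int16))
    return (diff > threshold).astype(np.uint8)  # binary foreground mask

def gravity_center(mask):
    """Centroid (row, col) of the foreground pixels; None if no
    foreground pixel was detected."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return (ys.mean(), xs.mean())

def sobel_edges(img):
    """Gradient magnitude via the 3x3 Sobel kernels, computed over
    the valid interior region only (no border padding)."""
    gx_k = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])
    gy_k = gx_k.T
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for y in range(h - 2):
        for x in range(w - 2):
            win = img[y:y + 3, x:x + 3].astype(np.int32)
            out[y, x] = np.hypot((win * gx_k).sum(), (win * gy_k).sum())
    return out

# Usage: a bright square "object" appears against an empty background.
bg = np.zeros((8, 8), dtype=np.uint8)
cur = bg.copy()
cur[3:5, 3:5] = 200
mask = detect_motion(bg, cur)
center = gravity_center(mask)   # centroid of the 2x2 foreground patch
edges = sobel_edges(cur)        # strong responses around the square
```

In the hardware version described by the abstract, these stages would be streaming pixel operations rather than whole-frame array passes, with the background frame held in SRAM and the centroid accumulated as pixels arrive.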

Keywords:
Background subtraction; Computer vision; Artificial intelligence; Object detection; Sobel operator; Field-programmable gate array (FPGA); Motion detection; Video tracking; Edge detection; Image processing; Computer hardware; Pixel; Pattern recognition

Metrics

- Cited By: 16
- FWCI (Field Weighted Citation Impact): 0.96
- References: 8
- Citation Normalized Percentile: 0.80

Topics

Video Surveillance and Tracking Methods
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Fire Detection and Safety Systems
Physical Sciences →  Engineering →  Safety, Risk, Reliability and Quality
Image Enhancement Techniques
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition