JOURNAL ARTICLE

Dynamic Background Modeling for Foreground Segmentation

Abstract

This paper presents a dynamic background modeling approach for foreground segmentation. The classification between foreground and background is based on the Bayes decision rule. The posterior probability of a pixel belonging to the background or the foreground is estimated directly from the occurrence frequency of its quantized value. Experimental results show that the presented method runs in real time and performs well in complex, dynamic environments.
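The frequency-based Bayes classification sketched in the abstract can be illustrated with a minimal stand-in. The class below is not the paper's exact method (whose features and update rules are not given here): it keeps a running occurrence frequency of each pixel's quantized intensity and labels rarely observed values as foreground, which approximates thresholding the posterior P(background | v); `n_bins`, `alpha`, and `tau` are assumed parameters.

```python
import numpy as np

class FrequencyBackgroundModel:
    """Per-pixel background model over quantized intensities.

    A pixel value that recurs frequently is judged background; a rarely
    observed value is judged foreground. Thresholding the running
    frequency P(v) approximates the Bayes decision on the posterior
    P(background | v) described in the abstract. The parameters
    n_bins, alpha, and tau are illustrative choices, not the paper's.
    """

    def __init__(self, n_bins=32, alpha=0.05, tau=0.2):
        self.n_bins = n_bins  # quantization levels for 8-bit intensities
        self.alpha = alpha    # learning rate of the running frequency
        self.tau = tau        # frequency below which a value is foreground
        self.hist = None      # per-pixel frequency of each quantized value

    def _quantize(self, frame):
        # Map 8-bit intensities into n_bins discrete levels.
        return (frame.astype(np.int32) * self.n_bins) // 256

    def apply(self, frame):
        """Return a uint8 foreground mask (255 = foreground) and adapt."""
        q = self._quantize(frame)
        rows, cols = np.indices(q.shape)
        if self.hist is None:
            # Bootstrap: treat the first frame as pure background.
            self.hist = np.zeros(q.shape + (self.n_bins,), dtype=np.float64)
            self.hist[rows, cols, q] = 1.0
            return np.zeros(q.shape, dtype=np.uint8)
        # Decide: values seen less often than tau are foreground.
        fg = self.hist[rows, cols, q] < self.tau
        # Adapt: exponential running average of value occurrence.
        self.hist *= 1.0 - self.alpha
        self.hist[rows, cols, q] += self.alpha
        return fg.astype(np.uint8) * 255
```

With these settings a persistent new object is absorbed into the background after roughly log(1 - tau)/log(1 - alpha) ≈ 5 frames, which is one way such a model adapts to scene changes.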

Keywords:
Foreground detection; Segmentation; Artificial intelligence; Computer science; Computer vision; Pixel; Image segmentation; Background subtraction; Pattern recognition (psychology); Bayes' theorem; Posterior probability; Bayesian probability

Metrics

Cited By: 3
FWCI (Field Weighted Citation Impact): 0.31
Refs: 8
Citation Normalized Percentile: 0.68


Topics

Video Surveillance and Tracking Methods (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Advanced Vision and Imaging (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
Remote Sensing and Land Use (Physical Sciences → Earth and Planetary Sciences → Atmospheric Science)

Related Documents

JOURNAL ARTICLE

Foreground Object Segmentation in Dynamic Background Scenarios

Tomasz Kryjak

Journal: Image Processing & Communications, Year: 2014, Vol: 19 (2-3), Pages: 25-36
JOURNAL ARTICLE

An Adaptive Background Modeling Method for Foreground Segmentation

Zuofeng Zhong, Bob Zhang, Guangming Lu, Yong Zhao, Yong Xu

Journal: IEEE Transactions on Intelligent Transportation Systems, Year: 2016, Vol: 18 (5), Pages: 1109-1121
JOURNAL ARTICLE

Robust Dynamic Background Modeling for Foreground Estimation

Ning Qian, Fangfang Wu, Weisheng Dong, Jinjian Wu, Guangming Shi, Xin Li

Journal: 2022 IEEE International Conference on Visual Communications and Image Processing (VCIP), Year: 2022, Vol: 34, Pages: 1-5