JOURNAL ARTICLE

Moving object segmentation for video surveillance and conferencing applications

Abstract

Video surveillance and video conferencing systems are widely deployed in practice and remain active research topics. A common requirement in such systems is accurate segmentation of moving objects. This paper presents an algorithm for segmenting and extracting moving objects, suited to surveillance and video conferencing applications in which a still background frame can be captured beforehand. Since edge detection is often used to extract accurate object boundaries, the first step of the algorithm combines two kinds of edge points, detected from the frame difference and from background subtraction. After edge points belonging to the background frame are removed, the resulting moving-edge map is passed to the object extraction step. The fundamental task of this step is to declare candidate moving-object regions, which are then refined by morphological closing and opening operations. The algorithm is applied to real video sequences and achieves good segmentation performance.
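The pipeline described in the abstract can be sketched in a few lines. This is an illustrative pure-Python toy, not the authors' exact method: the thresholds, the 3x3 structuring element, and the OR rule for combining the two detections are assumptions.

```python
# Sketch of the abstract's pipeline: threshold the frame difference and the
# background subtraction into binary masks, merge them, then apply a
# morphological closing (dilate, then erode) to fill small gaps.

def diff_mask(a, b, thresh):
    """Per-pixel absolute difference, thresholded to a binary mask."""
    return [[1 if abs(pa - pb) > thresh else 0 for pa, pb in zip(ra, rb)]
            for ra, rb in zip(a, b)]

def combine(m1, m2):
    """A pixel is 'moving' if either detector fires (logical OR)."""
    return [[p | q for p, q in zip(r1, r2)] for r1, r2 in zip(m1, m2)]

def dilate(m):
    """Max over each pixel's 3x3 neighborhood (clipped at the borders)."""
    h, w = len(m), len(m[0])
    return [[max(m[ny][nx]
                 for ny in range(max(0, y - 1), min(h, y + 2))
                 for nx in range(max(0, x - 1), min(w, x + 2)))
             for x in range(w)] for y in range(h)]

def erode(m):
    """Min over each pixel's 3x3 neighborhood (clipped at the borders)."""
    h, w = len(m), len(m[0])
    return [[min(m[ny][nx]
                 for ny in range(max(0, y - 1), min(h, y + 2))
                 for nx in range(max(0, x - 1), min(w, x + 2)))
             for x in range(w)] for y in range(h)]

def closing(m):
    """Morphological closing: dilation followed by erosion."""
    return erode(dilate(m))

# Toy 5x5 frames: a static background and a current frame whose "object"
# appears only as a few sparse bright pixels.
background = [[10] * 5 for _ in range(5)]
previous   = [[10] * 5 for _ in range(5)]
current    = [row[:] for row in background]
current[1][1] = 200; current[1][3] = 200; current[3][2] = 200

fd = diff_mask(current, previous, 30)    # frame-difference detections
bs = diff_mask(current, background, 30)  # background-subtraction detections
mask = closing(combine(fd, bs))          # merged, gap-filled moving mask
```

The closing step fills the gap between the sparse detections at (1,1) and (1,3), so the pixel (1,2) joins the moving region even though neither detector fired there; an opening pass (erode, then dilate) would follow in the same style to remove isolated noise pixels.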

Keywords:
Computer science; Computer vision; Artificial intelligence; Background subtraction; Frame difference; Edge detection; Moving object detection; Image segmentation; Videoconferencing; Morphological closing and opening; Video tracking; Pixel; Multimedia

Metrics

Cited by: 4
FWCI (Field-Weighted Citation Impact): 0.77
References: 14
Citation Normalized Percentile: 0.75

Topics

Video Surveillance and Tracking Methods
Physical Sciences → Computer Science → Computer Vision and Pattern Recognition
Advanced Image and Video Retrieval Techniques
Physical Sciences → Computer Science → Computer Vision and Pattern Recognition
Advanced Vision and Imaging
Physical Sciences → Computer Science → Computer Vision and Pattern Recognition