JOURNAL ARTICLE

Foreground Segmentation via Background Modeling on Riemannian Manifolds

Abstract

Statistical modeling in color space is a widely used approach to background modeling for foreground segmentation. Nevertheless, computing such statistics directly on image values is sometimes insufficient to achieve good discrimination. The image may therefore be converted into a more information-rich form, such as a tensor field, in which color and gradients can be encoded. In this paper, we exploit the theoretically well-founded differential-geometric properties of the Riemannian manifold on which tensors lie. We propose a novel and efficient approach to foreground segmentation on tensor fields, based on data modeling by means of Gaussian mixture models (GMM) directly in the tensor domain. We introduce an Expectation Maximization (EM) algorithm to estimate the mixture parameters, and propose two algorithms based on an online K-means approximation of EM in order to speed up the process. Theoretical analysis and experimental evaluations demonstrate the promise and effectiveness of the proposed framework.
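The online K-means approximation of EM mentioned in the abstract can be illustrated with a minimal per-pixel Gaussian mixture background model in the Stauffer-Grimson style, operating on ordinary grayscale values rather than the paper's tensor domain. All class and parameter names below (`OnlineGMMBackground`, `lr`, `match_sigma`, `bg_weight`, and their default values) are illustrative assumptions, not the paper's formulation:

```python
import numpy as np

class OnlineGMMBackground:
    # Per-pixel Gaussian mixture background model for grayscale frames,
    # updated with an online K-means approximation of EM.  A sketch only:
    # the paper applies this idea to tensor-valued data on a Riemannian
    # manifold; here we use scalar intensities for clarity.
    def __init__(self, shape, k=3, lr=0.05, var0=36.0,
                 match_sigma=2.5, bg_weight=0.25):
        h, w = shape
        self.k, self.lr = k, lr
        self.match_sigma, self.bg_weight, self.var0 = match_sigma, bg_weight, var0
        self.mu = np.zeros((h, w, k))          # component means
        self.var = np.full((h, w, k), var0)    # component variances
        self.pi = np.full((h, w, k), 1.0 / k)  # mixture weights

    def apply(self, frame):
        """Update the model with one frame; return a boolean foreground mask."""
        frame = frame.astype(float)
        d2 = (frame[..., None] - self.mu) ** 2 / self.var
        matched = d2 < self.match_sigma ** 2
        # closest matching component per pixel (argmin over masked distances)
        best = np.argmin(np.where(matched, d2, np.inf), axis=-1)
        has_match = matched.any(axis=-1)
        onehot = np.eye(self.k, dtype=bool)[best] & has_match[..., None]

        # online weight update: pi <- (1 - lr) * pi + lr * [matched]
        self.pi = (1 - self.lr) * self.pi + self.lr * onehot
        # mean / variance updated only for the matched component
        diff = frame[..., None] - self.mu
        self.mu = np.where(onehot, self.mu + self.lr * diff, self.mu)
        self.var = np.where(onehot,
                            (1 - self.lr) * self.var + self.lr * diff ** 2,
                            self.var)
        # unmatched pixels: reinitialise the weakest component on the new value
        weakest = np.argmin(self.pi, axis=-1)
        replace = np.eye(self.k, dtype=bool)[weakest] & ~has_match[..., None]
        self.mu = np.where(replace, frame[..., None], self.mu)
        self.var = np.where(replace, self.var0, self.var)
        self.pi = np.where(replace, 0.05, self.pi)
        self.pi /= self.pi.sum(axis=-1, keepdims=True)

        # simplified decision: foreground if nothing matched, or the matched
        # component carries too little weight to count as background
        best_w = np.take_along_axis(self.pi, best[..., None], -1)[..., 0]
        return ~has_match | (best_w < self.bg_weight)
```

After a few frames of a static scene, the matched component's weight grows toward 1 and those pixels are classified as background; a pixel whose value suddenly departs from all learned components is flagged as foreground and seeds a new low-weight component.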

Keywords:
Tensors, Computer science, Image segmentation, Artificial intelligence, Manifolds, Mixture models, Riemannian manifolds, Segmentation, Pattern recognition, Expectation–maximization algorithm, Information geometry, Fields (mathematics), Domains (mathematical analysis), Computer vision, Mathematics, Maximum likelihood, Mathematical analysis, Pure mathematics, Statistics, Geometry, Curvature

Metrics

Cited by: 11
FWCI (Field-Weighted Citation Impact): 2.56
References: 17
Citation Normalized Percentile: 0.90


Topics

Video Surveillance and Tracking Methods
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Remote-Sensing Image Classification
Physical Sciences →  Engineering →  Media Technology
Advanced Image and Video Retrieval Techniques
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition