JOURNAL ARTICLE

Event clustering of consumer pictures using foreground/background segmentation

Abstract

This paper describes a new algorithm for classifying consumer photographs into events when date and time information is not available. Without any contextual information about the pictures, we must rely on image content alone. Our approach combines an efficient segmentation scheme with low-level feature extraction to detect event boundaries. Specifically, we have developed a foreground/background segmentation algorithm based on block-based clustering. Block segmentation is less precise than pixel-level segmentation, but it still gives good results at low computational cost. To benchmark our approach, a third-party ground-truth database was created with the help of the Human Factors Laboratory at Kodak. Based on these benchmarks, we conclude that a simple block-based segmentation scheme outperforms the original block-based event clustering algorithm, which uses no segmentation. We believe that further improvements, especially in segmentation and feature extraction, should lead to better results in the future.
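As a rough illustration of the block-based approach described above, the sketch below divides an image into fixed-size blocks, clusters the blocks on their mean color with k-means (k = 2), and labels the cluster that dominates the image border as background. The block size, the use of k-means, the mean-color feature, and the border heuristic are all illustrative assumptions, not the exact method from the paper.

```python
# Minimal sketch of block-based foreground/background segmentation.
# Block size, k-means with k=2, mean-color features, and the border
# heuristic are assumptions for illustration, not the paper's method.
import numpy as np
from sklearn.cluster import KMeans

def segment_blocks(image, block=16):
    """Cluster non-overlapping blocks into foreground/background.

    image: H x W x 3 float array. Returns a (H//block, W//block)
    boolean mask that is True for foreground blocks.
    """
    h, w = image.shape[0] // block, image.shape[1] // block
    # Low-level feature per block: mean color (assumption).
    feats = np.array([
        image[i*block:(i+1)*block, j*block:(j+1)*block]
        .reshape(-1, 3).mean(axis=0)
        for i in range(h) for j in range(w)
    ])
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(feats).reshape(h, w)
    # Heuristic: the cluster dominating the image border is background.
    border = np.concatenate([labels[0], labels[-1], labels[:, 0], labels[:, -1]])
    background = np.bincount(border).argmax()
    return labels != background

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic test: dark background with a bright central square.
    img = rng.normal(0.2, 0.05, (128, 128, 3))
    img[40:90, 40:90] += 0.6
    print(segment_blocks(img).astype(int))
```

Upsampling the block mask by the block size yields a coarse pixel mask; per-photo foreground and background features could then feed the event-boundary comparison the paper describes.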

Keywords:
Event clustering, image segmentation, foreground/background segmentation, block-based clustering, consumer photographs, feature extraction, cluster analysis, pattern recognition, computer vision, ground truth, benchmarking

Metrics

Cited by: 2
FWCI (Field-Weighted Citation Impact): 0.00
References: 11
Citation Normalized Percentile: 0.20

Topics

Image Retrieval and Classification Techniques
Advanced Image and Video Retrieval Techniques
Video Analysis and Summarization
(all under Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)