JOURNAL ARTICLE

Graph-Based Point Cloud Color Denoising with 3-Dimensional Patch-Based Similarity

Abstract

Point clouds are used in many 3-D applications such as cross-reality (XR) and realistic 3-D display. They consist of a set of points with 3-D coordinates and associated color signals. These color signals are often perturbed by noise induced by the measurement errors of scanning devices. In this paper, we propose a denoising method for point cloud color signals. Since many conventional methods for point cloud color denoising are based on a low-pass filter in the graph spectral domain, denoising accuracy depends heavily on the choice of graph. First, we propose a graph construction method using 3-D patch-based similarity, in which the similarity is calculated over small 3-D patches around the connected points. This contrasts with conventional graph construction methods for denoising, which rely on pointwise properties such as pairwise point distances and differences in color. Second, we propose a low-pass filtering method whose frequency response is chosen automatically depending on the estimated noise level. Our experimental results show that the proposed method, 3-D patch-based similarity (3DPBS), achieves the best denoising accuracy compared with state-of-the-art graph-based methods.
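The pipeline the abstract describes, a similarity graph built from small 3-D patches followed by graph low-pass filtering of the colors, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name and parameters (`k`, `sigma`, `alpha`) are assumptions, the "patch" descriptor is simply the mean color of each point's neighborhood, and a fixed blending factor `alpha` stands in for the paper's automatic, noise-level-dependent frequency response.

```python
import numpy as np

def patch_graph_denoise(points, colors, k=8, sigma=0.1, alpha=0.5):
    """Hypothetical sketch: k-NN graph with 3-D patch-similarity weights,
    followed by a one-step graph low-pass filter on the color signal."""
    n = len(points)
    # Pairwise Euclidean distances between point coordinates.
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    np.fill_diagonal(dist, np.inf)
    nbrs = np.argsort(dist, axis=1)[:, :k]  # k nearest neighbors per point

    # Simple "patch" descriptor: mean color over each point's neighborhood.
    patches = np.array([colors[nbrs[i]].mean(axis=0) for i in range(n)])

    # Edge weights from patch similarity via a Gaussian kernel,
    # instead of raw point-to-point distance or color difference.
    W = np.zeros((n, n))
    for i in range(n):
        for j in nbrs[i]:
            w = np.exp(-np.sum((patches[i] - patches[j]) ** 2) / (2 * sigma**2))
            W[i, j] = W[j, i] = w

    deg = W.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0  # guard isolated points against division by zero

    # One-step low-pass filter: blend each color with its weighted
    # neighborhood average (a convex combination of input colors).
    return (1 - alpha) * colors + alpha * (W @ colors) / deg
```

Because the output is a convex combination of the input colors, it stays within the input range; a spectral implementation would instead shape the filter response over the graph Laplacian eigenvalues.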

Keywords:
Noise reduction, Point cloud, Graph signal processing, Similarity (geometry), Filter (signal processing), Pattern recognition, Computer vision

Metrics

Cited by: 10
FWCI (Field-Weighted Citation Impact): 1.82
References: 28
Citation Normalized Percentile: 0.82

Topics

Advanced Vision and Imaging
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
3D Shape Modeling and Analysis
Physical Sciences →  Engineering →  Computational Mechanics
Optical measurement and interference techniques
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition