JOURNAL ARTICLE

A Wilcoxon Norm Based Robust Machine Learning Approach for Traffic Noise Prediction

Abstract

The primary objective of this research is to construct an empirical traffic noise prediction model for evaluating the equivalent noise level (Leq) in terms of equivalent traffic volume under heterogeneous traffic flow. Commercial road networks were selected for monitoring and modeling. This work introduces a novel, robust machine learning approach for traffic noise prediction based on the Wilcoxon norm (WNN). The proposed WNN is designed under the assumption that the training samples contain strong outliers (a high percentage of corrupted data), and the cost function chosen is therefore a robust norm, the Wilcoxon norm. In the presence of outliers, most computational intelligence models fail to predict the output correctly. This paper highlights how the Wilcoxon norm based artificial neural network model (WNN) outperforms a conventional multilayer perceptron neural network when outliers are present. For validation, the traffic noise problem is treated here as a system identification problem. The simulation study shows that the Wilcoxon norm based artificial neural network model performs best in the presence of outliers.
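The robust cost function named in the abstract can be illustrated with a short sketch. This is not the paper's own implementation; it assumes the standard rank-based (Jaeckel/Wilcoxon) dispersion from robust regression, in which each residual is weighted by a linear score of its rank, `a(i) = sqrt(12) * (i/(n+1) - 1/2)`, rather than by its magnitude, so a single large outlier contributes linearly instead of quadratically as in squared-error loss:

```python
import numpy as np

def wilcoxon_norm(residuals):
    """Wilcoxon (rank-based) pseudo-norm of a residual vector.

    Each residual is paired with a linear score of its rank,
    a(i) = sqrt(12) * (i/(n+1) - 0.5), so the influence of an
    outlier grows with its rank, not its magnitude. The scores
    sum to zero, making the norm invariant to a constant shift
    of all residuals.
    """
    e = np.asarray(residuals, dtype=float)
    n = e.size
    # 0-based order -> 1..n ranks (ties broken arbitrarily)
    ranks = np.argsort(np.argsort(e)) + 1
    scores = np.sqrt(12.0) * (ranks / (n + 1.0) - 0.5)
    return float(np.sum(scores * e))
```

Used as the training criterion in place of mean squared error, this norm keeps gradient updates from being dominated by corrupted samples, which is the property the abstract attributes to the WNN.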

Keywords:
Wilcoxon signed-rank test; Outlier; Artificial neural network; Artificial intelligence; Noise; Machine learning; Norm (mathematics); Data mining; Computer science; Mathematics; Statistics

Metrics

Cited by: 4
FWCI (Field-Weighted Citation Impact): 0.00
References: 12
Citation Normalized Percentile: 0.22

Topics

Neural Networks and Applications
Physical Sciences →  Computer Science →  Artificial Intelligence
Image and Signal Denoising Methods
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Control Systems and Identification
Physical Sciences →  Engineering →  Control and Systems Engineering