JOURNAL ARTICLE

L_p_ Approximation by ReLU Neural Networks

Eman Samir Bhaya, Zainab Abdulmunim Sharba

Year: 2020 Journal:   Karbala International Journal of Modern Science Vol: 6 (4)

Abstract

Neural networks can approximate functions for many choices of activation function. Here we treat only networks with a simple, particular activation function: the rectified linear unit (ReLU). The main aim of this paper is to introduce a constructive universal approximation theorem and to estimate the error of the universal approximation. We obtain an optimal approximation when the basis is independent of the target function, and we prove an analogue of Debao Chen's theorem for approximation.
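As a minimal sketch (not taken from the paper), the construction behind such results can be illustrated numerically: a shallow ReLU network with one hidden layer can reproduce any piecewise-linear interpolant of a target function, and its L_p error on [0, 1] can be estimated by quadrature. The target sin(πx), the knot count `n`, and all function names here are illustrative assumptions.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def relu_interpolant(f, n):
    """Shallow ReLU network equal to the piecewise-linear
    interpolant of f at n+1 equispaced knots on [0, 1]."""
    x = np.linspace(0.0, 1.0, n + 1)
    y = f(x)
    slopes = np.diff(y) / np.diff(x)
    # outer weights are the slope changes a_k = s_k - s_{k-1} (a_0 = s_0);
    # hidden-unit biases are the knots x_0, ..., x_{n-1}
    weights = np.diff(slopes, prepend=0.0)
    knots = x[:-1]
    def net(t):
        t = np.atleast_1d(np.asarray(t, dtype=float))
        return y[0] + relu(t[:, None] - knots[None, :]) @ weights
    return net

def lp_error(f, net, p=2, m=10_000):
    """Numerical estimate of the L_p([0, 1]) approximation error."""
    t = np.linspace(0.0, 1.0, m)
    return np.mean(np.abs(f(t) - net(t)) ** p) ** (1.0 / p)

f = lambda x: np.sin(np.pi * x)
for n in (4, 8, 16):
    print(n, lp_error(f, relu_interpolant(f, n)))
```

Doubling the number of hidden units roughly quarters the L_2 error for this smooth target, consistent with the O(n^-2) rate of piecewise-linear interpolation.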

Keywords:
Function approximation, approximation error, artificial neural network, activation function, constructive approximation, linear approximation, approximation theory, approximation algorithm

Metrics

Cited By: 4
FWCI (Field-Weighted Citation Impact): 0.59
References: 24
Citation Normalized Percentile: 0.74


Topics

Neural Networks and Applications
Physical Sciences →  Computer Science →  Artificial Intelligence
Fuzzy Logic and Control Systems
Physical Sciences →  Computer Science →  Artificial Intelligence
Model Reduction and Neural Networks
Physical Sciences →  Physics and Astronomy →  Statistical and Nonlinear Physics

Related Documents

JOURNAL ARTICLE

Approximation capabilities of deep ReLU neural networks

Elbrächter, Dennis Maximillian

Journal:   University of Vienna Year: 2021
JOURNAL ARTICLE

Optimal function approximation with ReLU neural networks

Bo Liu, Yi Liang

Journal:   Neurocomputing Year: 2021 Vol: 435 Pages: 216-227
JOURNAL ARTICLE

Approximation of compositional functions with ReLU neural networks

Qi Gong, Wei Kang, Fariba Fahroo

Journal:   Systems & Control Letters Year: 2023 Vol: 175 Pages: 105508-105508
JOURNAL ARTICLE

Rates of approximation by ReLU shallow neural networks

Tong Mao, Ding-Xuan Zhou

Journal:   Journal of Complexity Year: 2023 Vol: 79 Pages: 101784-101784
JOURNAL ARTICLE

Approximation Algorithms for Training One-Node ReLU Neural Networks

Santanu S. Dey, Guanyi Wang, Yao Xie

Journal:   IEEE Transactions on Signal Processing Year: 2020 Vol: 68 Pages: 6696-6706