JOURNAL ARTICLE

ReLU Neural Network Approximation to Discontinuous Functions

Abstract

This work studies approximation properties of ReLU neural networks for discontinuous functions.

\section{Least-squares neural network (LSNN) method for linear advection-reaction equation: discontinuity interface}

Z. Cai et al. studied the least-squares ReLU neural network (LSNN) method for solving a linear advection-reaction equation with a discontinuous solution in \cite{cai2021least}. The method is based on a least-squares formulation and uses a new class of approximating functions: ReLU neural network (NN) functions. A critical additional component of the LSNN method, distinguishing it from other NN-based methods, is a properly designed, physics-preserving discrete differential operator.

In this work, we study the LSNN method for problems with discontinuity interfaces. First, we show that ReLU NN functions of depth $\lceil \log_2(d+1)\rceil+1$ can approximate, to any prescribed accuracy, any $d$-dimensional step function whose discontinuity interface is generated by a vector field as streamlines. By decomposing the solution into continuous and discontinuous parts, we prove theoretically that the discretization error of the LSNN method using ReLU NN functions of depth $\lceil \log_2(d+1)\rceil+1$ is determined mainly by the continuous part of the solution, provided that the solution jump is constant. Numerical results for both two- and three-dimensional test problems with various discontinuity interfaces show that the LSNN method with enough layers is accurate and does not exhibit the common Gibbs phenomenon along discontinuity interfaces.

\section{Least-squares neural network (LSNN) method for linear advection-reaction equation: non-constant jumps}

In Chapter \ref{non-constant jumps}, we show theoretically that the LSNN method is also capable of accurately approximating non-constant jumps along discontinuity interfaces that are not necessarily straight lines. Theoretical results are confirmed through multiple numerical examples with $d=2,3$ and various non-constant jumps and interface shapes, showing that the LSNN method with $\lceil \log_2(d+1)\rceil+1$ layers approximates solutions accurately with fewer degrees of freedom than mesh-based methods and without the common Gibbs phenomenon along discontinuity interfaces having non-constant jumps.

\section{ReLU neural network approximation to piecewise constant functions}

Chapter \ref{pconst} studies the approximation property of ReLU neural networks (NNs) for piecewise constant functions with unknown interfaces in bounded regions of $\mathbb{R}^d$. Under the assumption that the discontinuity interface $\Gamma$ may be approximated by a connected series of hyperplanes with a prescribed accuracy $\varepsilon > 0$, we show that a three-layer ReLU NN is sufficient to accurately approximate any piecewise constant function, and we establish its error bound. Moreover, if the discontinuity interface is convex, an analytical formula for the ReLU NN approximation with exact weights and biases is provided.
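Two building blocks behind these approximation results can be illustrated concretely: a two-layer ReLU ramp that approximates the one-dimensional Heaviside step to any prescribed accuracy, and the identity $\min(a,b) = a - \mathrm{ReLU}(a-b)$, which lets a deeper ReLU network represent the minimum of affine functions and hence an indicator of a convex region bounded by hyperplanes. The NumPy sketch below is a simplified illustration under these assumptions, not the exact networks constructed in the text; the function names (`step_eps`, `convex_indicator`) are hypothetical.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def step_eps(t, eps=1e-2):
    """Two-layer ReLU ramp: 0 for t <= 0, 1 for t >= eps, linear in between.
    Approximates the Heaviside step with transition width eps."""
    return relu(t / eps) - relu(t / eps - 1.0)

def relu_min(a, b):
    """min(a, b) = a - ReLU(a - b): a pairwise minimum built from one ReLU."""
    return a - relu(a - b)

def convex_indicator(x, normals, offsets, eps=1e-2):
    """Approximate indicator of the convex region {x : n_i . x <= c_i for all i}
    by feeding min_i (c_i - n_i . x) through the ReLU ramp (a hypothetical
    sketch of the three-layer construction described in the text)."""
    margins = offsets - normals @ x          # positive inside each half-space
    m = margins[0]
    for v in margins[1:]:
        m = relu_min(m, v)                   # iterated ReLU-based minimum
    return step_eps(m, eps)

# 1D step: values on either side of the jump at t = 0
print(step_eps(np.array([-0.5, 0.5])))       # -> [0. 1.]

# 2D convex region: the unit square [0,1]^2 cut out by four half-spaces
normals = np.array([[-1.0, 0.0], [1.0, 0.0], [0.0, -1.0], [0.0, 1.0]])
offsets = np.array([0.0, 1.0, 0.0, 1.0])
print(convex_indicator(np.array([0.5, 0.5]), normals, offsets))  # inside  -> 1.0
print(convex_indicator(np.array([2.0, 0.5]), normals, offsets))  # outside -> 0.0
```

Away from an $\varepsilon$-neighborhood of the jump, `step_eps` is exact, which is why the approximation error in such constructions is controlled by the transition width rather than by oscillation, consistent with the absence of a Gibbs phenomenon reported above.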

Keywords:
Discontinuity; Artificial neural network; Discretization; Jump; Classification of discontinuities; Approximation error



Author: Choi, Junpyo
Journal: OPAL (Open@LaTrobe) (La Trobe University), 2025