JOURNAL ARTICLE

Nonconvex Sparse Regularization for Deep Neural Networks and Its Optimality

Ilsang Ohn, Yongdai Kim

Year: 2021   Journal: Neural Computation   Vol: 34 (2)   Pages: 476-517   Publisher: The MIT Press

Abstract

Recent theoretical studies proved that deep neural network (DNN) estimators obtained by minimizing empirical risk with a certain sparsity constraint can attain optimal convergence rates for regression and classification problems. However, the sparsity constraint requires knowing certain properties of the true model, which are not available in practice. Moreover, computation is difficult due to the discrete nature of the sparsity constraint. In this letter, we propose a novel penalized estimation method for sparse DNNs that resolves these problems with the sparsity constraint. We establish an oracle inequality for the excess risk of the proposed sparse-penalized DNN estimator and derive convergence rates for several learning tasks. In particular, we prove that the sparse-penalized estimator can adaptively attain minimax convergence rates for various nonparametric regression problems. For computation, we develop an efficient gradient-based optimization algorithm that guarantees the monotonic reduction of the objective function.
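The abstract describes minimizing an empirical risk plus a nonconvex sparsity penalty with a gradient-based algorithm that monotonically reduces the objective. As a rough illustration only (not the paper's algorithm or network architecture), the sketch below runs backtracking gradient descent on a linear model with a clipped-L1 penalty λ·min(|θ|/τ, 1), a common nonconvex sparsity surrogate; the penalty choice, hyperparameter values, and all function names here are assumptions for illustration.

```python
import numpy as np

def clipped_l1(theta, lam, tau):
    # Nonconvex clipped-L1 penalty: lam * min(|theta_j| / tau, 1), summed over
    # coordinates. Flat beyond tau, so large weights are not shrunk further.
    return lam * np.minimum(np.abs(theta) / tau, 1.0).sum()

def objective(theta, X, y, lam, tau):
    # Empirical squared-error risk of a linear model plus the sparse penalty.
    resid = X @ theta - y
    return 0.5 * np.mean(resid ** 2) + clipped_l1(theta, lam, tau)

def grad_risk(theta, X, y):
    # Gradient of the empirical risk term.
    return X.T @ (X @ theta - y) / len(y)

def penalty_subgrad(theta, lam, tau):
    # Subgradient of the penalty: slope lam/tau inside the clipping region,
    # zero where the penalty is flat (|theta_j| > tau) or at exactly zero.
    return np.where(np.abs(theta) < tau, np.sign(theta) * lam / tau, 0.0)

def fit(X, y, lam=0.1, tau=0.01, lr=1.0, steps=200):
    # Gradient descent with backtracking line search so the objective never
    # increases, loosely mimicking a monotone-descent guarantee.
    theta = np.zeros(X.shape[1])
    obj = objective(theta, X, y, lam, tau)
    history = [obj]
    for _ in range(steps):
        g = grad_risk(theta, X, y) + penalty_subgrad(theta, lam, tau)
        step = lr
        while True:
            cand = theta - step * g
            cand_obj = objective(cand, X, y, lam, tau)
            if cand_obj <= obj or step < 1e-12:
                break
            step *= 0.5  # backtrack until the objective decreases
        if cand_obj > obj:
            break  # no descent step found; stop early
        theta, obj = cand, cand_obj
        history.append(obj)
    return theta, history
```

On a toy sparse regression problem, the recorded objective values are non-increasing by construction, and coordinates whose true value is zero are pushed into the clipping region near zero while large coordinates are left essentially unpenalized.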

Keywords:
Minimax Estimator, Constraint, Mathematical Optimization, Regularization, Computation, Artificial Neural Network, Oracle, Mathematics, Empirical Risk Minimization, Convergence, Rate of Convergence, Computer Science, Algorithm, Artificial Intelligence

Metrics

Cited By: 16
FWCI (Field Weighted Citation Impact): 1.96
References: 51
Citation Normalized Percentile: 0.83

Topics

Sparse and Compressive Sensing Techniques
Physical Sciences →  Engineering →  Computational Mechanics
Statistical Methods and Inference
Physical Sciences →  Mathematics →  Statistics and Probability
Stochastic Gradient Optimization Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
