JOURNAL ARTICLE

Dynamic Regularization on Activation Sparsity for Neural Network Efficiency Improvement

Qing Yang, Jiachen Mao, Zuoguan Wang, Hai "Helen" Li

Year: 2021 | Journal: ACM Journal on Emerging Technologies in Computing Systems | Vol: 17 (4) | Pages: 1-16 | Publisher: Association for Computing Machinery

Abstract

When deploying deep neural networks in embedded systems, it is crucial to decrease the model size and computational complexity to improve execution speed and efficiency. In addition to conventional compression techniques, e.g., weight pruning and quantization, removing unimportant activations can also dramatically reduce the amount of data communication and the computation cost. Unlike weight parameters, the pattern of activations is directly related to input data and thereby changes dynamically. To regulate the dynamic activation sparsity (DAS), in this work, we propose a generic low-cost approach based on the winners-take-all (WTA) dropout technique. The network enhanced by the proposed WTA dropout, namely DASNet, features structured activation sparsity with an improved sparsity level. Compared to static feature map pruning methods, DASNets provide better computation cost reduction. The WTA dropout technique can be easily applied to deep neural networks without incurring additional training variables. More importantly, DASNet can be seamlessly integrated with other compression techniques, such as weight pruning and quantization, without compromising accuracy. Our experiments on various networks and datasets present significant runtime speedups with negligible accuracy losses.
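The core idea the abstract describes — winners-take-all dropout that keeps only the strongest activations and zeroes the rest — can be illustrated with a minimal sketch. This is not the paper's exact per-layer policy; the function name `wta_dropout` and the `keep_ratio` parameter are illustrative assumptions, not details from the article.

```python
import numpy as np

def wta_dropout(activations, keep_ratio=0.5):
    """Winners-take-all dropout sketch: retain the top-k activations
    by magnitude and zero out the rest, producing a sparse output.

    `keep_ratio` is an assumed hyperparameter controlling how many
    activations survive; the paper's actual selection rule may differ.
    """
    flat = activations.ravel()
    k = max(1, int(keep_ratio * flat.size))
    # Threshold at the k-th largest magnitude; weaker activations are dropped.
    threshold = np.partition(np.abs(flat), -k)[-k]
    mask = np.abs(activations) >= threshold
    return activations * mask

x = np.array([0.1, -2.0, 0.5, 3.0, -0.2, 1.5])
y = wta_dropout(x, keep_ratio=0.5)  # only the 3 largest-magnitude values survive
```

Because the surviving activations are selected dynamically per input, the resulting sparsity pattern changes with the data, which is the "dynamic" aspect the abstract contrasts with static feature map pruning.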

Keywords:
Computer science, Quantization (signal processing), Computation, Artificial neural network, Pruning, Dropout (neural networks), Regularization (mathematics), Deep neural networks, Speedup, Artificial intelligence, Algorithm, Machine learning, Parallel computing

Metrics

Cited by: 3
FWCI (Field-Weighted Citation Impact): 0.20
References: 20
Citation Normalized Percentile: 0.48


Topics

Advanced Neural Network Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Machine Learning and ELM
Physical Sciences →  Computer Science →  Artificial Intelligence