JOURNAL ARTICLE

SASCHA—Sparsity-Aware Stochastic Computing Hardware Architecture for Neural Network Acceleration

Wojciech Romaszkan, Tianmu Li, Puneet Gupta

Year: 2022   Journal: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems   Vol: 41 (11)   Pages: 4169-4180   Publisher: Institute of Electrical and Electronics Engineers

Abstract

Stochastic computing (SC) has recently emerged as a promising method for efficient machine learning acceleration. Its high compute density, affinity with dense linear algebra primitives, and approximation properties have an uncanny level of synergy with the computational requirements of deep neural networks. However, there is a conspicuous lack of work integrating SC hardware with sparsity awareness, which has brought significant performance improvements to conventional architectures. In this work, we identify why common sparsity-exploiting techniques are not easily applicable to SC accelerators and propose a new architecture, SASCHA (Sparsity-Aware Stochastic Computing Hardware Architecture for Neural Network Acceleration), that addresses those issues. SASCHA encompasses a set of techniques that make exploiting sparsity during inference practical for different types of SC computation. At 90% weight sparsity, SASCHA can be up to 6.5× faster and 5.5× more energy-efficient than comparable dense SC accelerators of similar area, without sacrificing dense-network throughput. SASCHA also outperforms sparse fixed-point accelerators by up to 4× in latency. To the best of our knowledge, SASCHA is the first SC accelerator architecture oriented around sparsity.
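The listing gives no source code, but the principle behind the abstract can be illustrated with a minimal sketch (Python, chosen here for illustration only; this is a generic unipolar SC dot product with a naive zero-weight skip, not SASCHA's actual microarchitecture): a value in [0, 1] is encoded as a random bitstream whose mean equals the value, multiplication reduces to a bitwise AND, accumulation to a scaled popcount, and a sparsity-aware design can skip all of that work for zero-valued weights.

# Illustrative sketch (assumption, not the paper's design): unipolar stochastic-
# computing dot product. Values in [0, 1] become random bitstreams whose mean
# equals the value; multiply = bitwise AND, accumulate = scaled popcount.
# Zero-valued weights are skipped entirely, the kind of saving a
# sparsity-aware SC architecture targets in hardware.
import numpy as np

rng = np.random.default_rng(0)

def to_bitstream(value, length):
    """Encode a value in [0, 1] as a random bitstream with mean ~= value."""
    return (rng.random(length) < value).astype(np.uint8)

def sc_dot(weights, activations, length=1024):
    """Stochastic dot product: AND per product, scaled popcount per term."""
    acc = 0.0
    for w, a in zip(weights, activations):
        if w == 0.0:          # sparsity skip: no stream generation, AND, or popcount
            continue
        w_bits = to_bitstream(w, length)
        a_bits = to_bitstream(a, length)
        prod_bits = w_bits & a_bits          # unipolar SC multiply
        acc += prod_bits.sum() / length      # stochastic estimate of w * a
    return acc

weights = [0.0, 0.5, 0.0, 0.25]
activations = [0.8, 0.6, 0.9, 0.4]
print("SC estimate:", sc_dot(weights, activations))      # close to 0.4
print("Exact:      ", float(np.dot(weights, activations)))

In hardware, the skipped iterations correspond to stream generators and AND/popcount cycles that never execute, which is the kind of latency and energy saving the abstract reports.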

Keywords:
Computer science, Hardware acceleration, Notation, Artificial neural network, Inference, Belief propagation, Acceleration, Hardware architecture, Theoretical computer science, Parallel computing, Algorithm, Artificial intelligence, Mathematics, Arithmetic, Programming language, Software

Metrics

Cited By: 8
FWCI (Field Weighted Citation Impact): 1.71
Refs: 54
Citation Normalized Percentile: 0.79

Topics

Error Correcting Code Techniques
Physical Sciences →  Computer Science →  Computer Networks and Communications
Stochastic Gradient Optimization Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Ferroelectric and Negative Capacitance Devices
Physical Sciences →  Engineering →  Electrical and Electronic Engineering

Related Documents

JOURNAL ARTICLE

Hardware Architecture of Stochastic Computing Neural Network

SONG Yinjie, CHEN Yuhao

Journal: DOAJ (Directory of Open Access Journals)   Year: 2021
JOURNAL ARTICLE

Hardware aware Convolutional Neural Network (CNN) training acceleration

Vink, Diederik

Journal: Imperial College Research Computing Service Data Repository   Year: 2023