JOURNAL ARTICLE

Low-power neuromorphic speech recognition engine with coarse-grain sparsity

Abstract

In recent years, interest has surged in neuromorphic computing and its hardware design for cognitive applications. In this work, we present a neuromorphic architecture, circuit, and device co-design that enables spike-based classification for a speech recognition task. The proposed neuromorphic speech recognition engine supports a sparsely connected deep spiking network with coarse granularity, yielding a large memory reduction with minimal index information. Simulation results show that the proposed deep spiking neural network accelerator achieves a phoneme error rate (PER) of 20.5% on the TIMIT database and consumes 2.57 mW in 40 nm CMOS at real-time performance. To alleviate the memory bottleneck, the use of non-volatile memory is also evaluated and discussed.
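The memory-reduction claim rests on coarse-grain sparsity: pruning whole blocks of the weight matrix so that one index per surviving block suffices, instead of one index per nonzero weight. The sketch below illustrates that bookkeeping advantage; the block size, layer shape, and density are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

# Illustrative parameters (assumed, not from the paper).
BLOCK = 16          # side length of each square weight block
ROWS, COLS = 256, 256
DENSITY = 0.25      # fraction of blocks kept after coarse-grain pruning

rng = np.random.default_rng(0)
n_blocks = (ROWS // BLOCK) * (COLS // BLOCK)
kept = rng.random(n_blocks) < DENSITY   # mask of surviving blocks
n_kept = int(kept.sum())

# Storage cost in weight-sized words: dense stores every weight;
# block-sparse stores the kept blocks plus one small index per kept block.
dense_words = ROWS * COLS
sparse_words = n_kept * BLOCK * BLOCK + n_kept   # weights + block indices

# Fine-grain sparsity with the same number of surviving weights would need
# one index per nonzero weight, i.e. BLOCK*BLOCK times more index storage.
coarse_index_words = n_kept
fine_index_words = n_kept * BLOCK * BLOCK

print(f"dense: {dense_words} words")
print(f"block-sparse: {sparse_words} words "
      f"({sparse_words / dense_words:.0%} of dense)")
print(f"index overhead, coarse vs fine: "
      f"{coarse_index_words} vs {fine_index_words}")
```

With these assumed numbers the block-sparse layout stores roughly a quarter of the dense weights while its index overhead stays a factor of BLOCK² smaller than fine-grain pruning would require, which is the trade-off the abstract summarizes as "large memory reduction with minimal index information."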

Keywords:
Neuromorphic engineering; Spiking neural network; Artificial neural network; Speech recognition; Phoneme error rate; Granularity; Memory bottleneck; Computer architecture; Embedded systems

Metrics

Cited by: 6
FWCI (Field-Weighted Citation Impact): 0.53
References: 12
Citation Normalized Percentile: 0.68

Topics

Advanced Memory and Neural Computing
Physical Sciences →  Engineering →  Electrical and Electronic Engineering
Ferroelectric and Negative Capacitance Devices
Physical Sciences →  Engineering →  Electrical and Electronic Engineering
Neural Networks and Reservoir Computing
Physical Sciences →  Computer Science →  Artificial Intelligence