JOURNAL ARTICLE

Advancing Spiking Neural Networks Toward Deep Residual Learning

Yifan Hu, Lei Deng, Yujie Wu, Man Yao, Guoqi Li

Year: 2024  Journal: IEEE Transactions on Neural Networks and Learning Systems  Vol: 36 (2)  Pages: 2353-2367  Publisher: Institute of Electrical and Electronics Engineers

Abstract

Despite the rapid progress of neuromorphic computing, the inadequate capacity and insufficient representation power of spiking neural networks (SNNs) severely restrict their application scope in practice. Residual learning and shortcuts have proven to be an important approach for training deep neural networks, but previous work has rarely assessed their applicability to the specifics of SNNs. In this article, we first identify that this negligence leads to impeded information flow and an accompanying degradation problem in a spiking version of vanilla ResNet. To address this issue, we propose a novel SNN-oriented residual architecture, termed MS-ResNet, which establishes membrane-based shortcut pathways, and we further prove that gradient norm equality can be achieved in MS-ResNet by introducing block dynamical isometry theory, which ensures the network is well-behaved in a depth-insensitive way. Thus, we are able to significantly extend the depth of directly trained SNNs, e.g., up to 482 layers on CIFAR-10 and 104 layers on ImageNet, without observing any degradation problem. To validate the effectiveness of MS-ResNet, experiments are conducted on both frame-based and neuromorphic datasets. MS-ResNet104 achieves a superior result of 76.02% accuracy on ImageNet, which is, to the best of our knowledge, the highest in the domain of directly trained SNNs. Great energy efficiency is also observed, with an average of only one spike per neuron needed to classify an input sample. We believe our powerful and scalable models will provide strong support for further exploration of SNNs.
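The abstract's core architectural idea, routing the residual shortcut onto the membrane-potential path (so the identity signal joins before the neuron fires, rather than being added to binary spike outputs), can be illustrated with a minimal sketch. Everything here is an illustrative assumption, not the paper's implementation: the function names, the single linear layer standing in for a conv block, and the `tau`/`v_th` values are hypothetical.

```python
import numpy as np

def lif_step(v, x, tau=2.0, v_th=1.0):
    """One step of a leaky integrate-and-fire (LIF) neuron:
    the membrane potential decays by 1/tau, integrates the input,
    emits a spike on crossing the threshold, then hard-resets."""
    v = v / tau + x                      # leaky integration of input current
    spike = (v >= v_th).astype(x.dtype)  # binary spike output
    v = v * (1.0 - spike)                # hard reset where a spike fired
    return v, spike

def ms_residual_block(x_seq, weight, tau=2.0, v_th=1.0):
    """Hypothetical membrane-shortcut (MS) residual block sketch.
    The identity shortcut is added to the weighted path *before* the
    LIF neuron, so the residual signal flows on the real-valued
    membrane path instead of the binary spike path.
    x_seq: (T, N) input over T time steps; weight: (N, N) linear layer."""
    v = np.zeros(x_seq.shape[1])
    out = []
    for x in x_seq:
        pre = x @ weight + x             # shortcut joins the pre-activation
        v, s = lif_step(v, pre, tau, v_th)
        out.append(s)
    return np.stack(out)
```

With a constant sub-threshold input, the membrane potential accumulates over a few time steps before the neurons fire once and reset, which is consistent with the abstract's observation of very sparse spiking (about one spike per neuron per sample).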

Keywords:
Spiking neural network; Computer science; Neuromorphic engineering; FLOPS; Residual; Artificial intelligence; Residual neural network; Scalability; Artificial neural network; Dependency (UML); Machine learning; Deep neural networks; Deep learning; Scope (computer science); Pattern recognition (psychology); Algorithm; Parallel computing

Metrics

Cited by: 93
FWCI (Field-Weighted Citation Impact): 33.96
References: 78
Citation Normalized Percentile: 1.00 (in top 1% and top 10%)


Topics

Advanced Memory and Neural Computing
Physical Sciences →  Engineering →  Electrical and Electronic Engineering
Ferroelectric and Negative Capacitance Devices
Physical Sciences →  Engineering →  Electrical and Electronic Engineering
Neural dynamics and brain function
Life Sciences →  Neuroscience →  Cognitive Neuroscience

Related Documents

JOURNAL ARTICLE

Deep Learning In Spiking Neural Networks

Kasabov

Journal:   OPAL (Open@LaTrobe) (La Trobe University) Year: 2018
JOURNAL ARTICLE

Deep Learning for Spiking Neural Networks

Yusuke Sakemi, Kai Morino

Journal: SEISAN KENKYU  Year: 2019  Vol: 71 (2)  Pages: 159-167
JOURNAL ARTICLE

Spiking Deep Residual Networks

Yangfan Hu, Huajin Tang, Gang Pan

Journal: IEEE Transactions on Neural Networks and Learning Systems  Year: 2021  Vol: 34 (8)  Pages: 5200-5205
BOOK-CHAPTER

Learning algorithms for deep spiking neural networks

Hong Qu, Xiaoling Luo, Yi Zhang

Elsevier eBooks  Year: 2024  Pages: 95-115