Abstract

Using a neural sampling approach, networks of stochastic spiking neurons interconnected with plastic synapses have been used to construct computational machines such as Restricted Boltzmann Machines (RBMs). Previous work towards building such networks achieved lower performance than traditional RBMs. More recently, Synaptic Sampling Machines (SSMs), in which the stochasticity for sampling is generated at the synapse, were shown to outperform equivalent RBMs. Stochastic synapses play a dual role: they act as a regularizer during learning and provide an efficient mechanism for implementing stochasticity in neural networks over a wide dynamic range. In this paper we show that SSMs with stochastic synapses, implemented as FPGA-based spiking neural networks, achieve high accuracy in classifying the MNIST handwritten digit database. We compare classification accuracy across bit precisions for stochastic and non-stochastic synapses and further argue that stochastic synapses have the same effect as synapses with higher bit precision while requiring significantly fewer computational resources.
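The two mechanisms the abstract combines — stochastic transmission at the synapse and reduced-precision weights — can be sketched in a few lines. This is a minimal illustrative model, not the authors' FPGA implementation: the blank-out transmission probability `p`, the uniform quantizer, and the function names are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(w, bits):
    # Uniform quantization of weights to a given bit precision
    # (illustrative; the paper's fixed-point format is not specified here).
    levels = 2 ** bits
    w_max = max(np.max(np.abs(w)), 1e-12)
    step = 2 * w_max / (levels - 1)
    return np.round(w / step) * step

def stochastic_synapse_drive(weights, spikes, p=0.5, bits=4):
    """Postsynaptic drive through stochastic, quantized synapses.

    Each synapse independently transmits a presynaptic spike with
    probability p (multiplicative blank-out noise) -- the synaptic
    stochasticity that SSMs use for sampling and regularization.
    """
    wq = quantize(weights, bits)          # low-precision weights
    mask = rng.random(weights.shape) < p  # per-synapse transmission events
    return (wq * mask) @ spikes
```

With `p = 1.0` the synapses become deterministic and the model reduces to an ordinary quantized-weight network, which is the non-stochastic baseline the bit-precision comparison refers to.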

Keywords:
MNIST database, Computer science, Spiking neural network, Sampling (signal processing), Artificial neural network, Stochastic computing, Synapse, Boltzmann machine, Artificial intelligence, Neuromorphic engineering, Stochastic neural network, Machine learning, Recurrent neural network, Neuroscience

Metrics

Cited by: 6
References: 9
FWCI (Field-Weighted Citation Impact): 0.48
Citation Normalized Percentile: 0.72
Topics

Advanced Memory and Neural Computing (Physical Sciences → Engineering → Electrical and Electronic Engineering)
Neural dynamics and brain function (Life Sciences → Neuroscience → Cognitive Neuroscience)
Neural Networks and Reservoir Computing (Physical Sciences → Computer Science → Artificial Intelligence)