JOURNAL ARTICLE

Analysis of biologically inspired artificial neural networks

Armen Stepanyants

Year: 2009. Journal: Frontiers in Systems Neuroscience, Vol. 3. Publisher: Frontiers Media

Abstract

Connectivity in most cortical networks is sparse and mainly excitatory, while the distribution of synaptic strengths has been shown to exhibit a slow non-exponential decay. To explore the reasons behind these three network features, we analyze robust artificial networks of excitatory and inhibitory McCulloch and Pitts neurons. Over the years, there has been a great deal of interest in McCulloch and Pitts model networks and their capacity. However, most of the theoretical and computational models studied thus far ignore biological constraints on the network architecture. Using knowledge about synaptic connectivity in the neocortex, we investigated artificial neural networks with biologically plausible connectivity properties. We analyzed networks of excitatory and inhibitory neurons ranging from 100 to 1000 cells in size - numbers that correspond to different cortical units. We hypothesized that cortical networks must be robust, metabolically inexpensive (low overall synaptic strength), and, at the same time, have large memory storage capacity. Our computational results first showed that robust artificial neural networks that minimize the overall strength of synaptic connectivity must be sparsely interconnected, in agreement with the experimental data. Second, the capacity of such networks increases with the fraction of inhibitory neurons (in the 0-50% range), accompanied by an increase in the overall synaptic strength. As a result of this tradeoff between network capacity and overall synaptic strength, optimal artificial networks must contain only a small fraction of inhibitory neurons, consistent with real cortical networks. Finally, we showed that the distribution of synaptic strengths in these optimal networks exhibits a slow non-exponential decay, again in agreement with experimental observations.
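The model class analyzed above can be illustrated with a minimal sketch of a sparse McCulloch-Pitts network that respects the biological constraints the abstract describes: sparse connectivity, a fixed sign per presynaptic neuron (excitatory or inhibitory), and a small inhibitory fraction. This is not the authors' code; the network size, sparsity, inhibitory fraction, and exponential weight magnitudes below are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100                  # network size (the study considers 100-1000 cells)
frac_inhibitory = 0.2    # small inhibitory fraction (assumed value)
sparsity = 0.1           # fraction of nonzero connections (assumed value)

# Each neuron is either excitatory (+1) or inhibitory (-1); all of its
# outgoing synapses share that sign (Dale's principle).
signs = np.where(rng.random(N) < frac_inhibitory, -1, 1)

# Sparse weight matrix: W[i, j] is the synapse from neuron j onto neuron i.
# Magnitudes are non-negative; the sign is set by the presynaptic cell type.
mask = rng.random((N, N)) < sparsity
W = mask * rng.exponential(1.0, (N, N)) * signs[np.newaxis, :]
np.fill_diagonal(W, 0.0)  # no self-connections

def step(state, W, theta=0.0):
    """One synchronous McCulloch-Pitts update: fire iff weighted input >= threshold."""
    return (W @ state >= theta).astype(int)

state = rng.integers(0, 2, N)  # random initial binary activity pattern
state = step(state, W)
```

A capacity analysis like the one in the abstract would then ask how many activity patterns can be made stable fixed points of `step` while keeping the total synaptic strength (e.g., `np.abs(W).sum()`) low; here the update rule alone is sketched.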
Conference: Computational and Systems Neuroscience (Cosyne) 2009, Salt Lake City, UT, United States, 26 Feb - 3 Mar 2009. Presentation Type: Poster Presentation. Topic: Poster Presentations.
Citation: Stepanyants, A. (2009). Analysis of biologically inspired artificial neural networks. Front. Syst. Neurosci. Conference Abstract: Computational and systems neuroscience 2009. doi: 10.3389/conf.neuro.06.2009.03.062
Copyright: The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers. They are made available through the Frontiers publishing platform as a service to conference organizers and presenters. The copyright in each abstract is owned by its author or his/her employer unless otherwise stated. Each abstract, and the collection as a whole, is published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may be reproduced, translated, adapted, and used in derivative works provided the authors and Frontiers are attributed. For Frontiers' terms and conditions, see https://www.frontiersin.org/legal/terms-and-conditions.
Received: 30 Jan 2009; Published Online: 30 Jan 2009.

Keywords:
Neocortex, Artificial neural network, Inhibitory postsynaptic potential, Excitatory postsynaptic potential, Computer science, Neuroscience, Artificial intelligence, Biology

Metrics

Cited By: 0
FWCI (Field-Weighted Citation Impact): 0.00
Refs: 0
Citation Normalized Percentile: 0.02

Topics

Neural Networks and Applications
Physical Sciences →  Computer Science →  Artificial Intelligence
Advanced Memory and Neural Computing
Physical Sciences →  Engineering →  Electrical and Electronic Engineering