JOURNAL ARTICLE

Recurrent Neural Networks: Associative Memory and Optimization

Ke-Lin Du

Year: 2011
Journal: Journal of Information Technology & Software Engineering, Vol. 01 (02)
Publisher: OMICS Publishing Group

Abstract

Due to their feedback connections, recurrent neural networks (RNNs) are dynamic models. Compared to feedforward neural networks (FNNs), RNNs can provide a more compact structure for approximating dynamic systems. For some RNN models, such as the Hopfield model and the Boltzmann machine, the fixed-point property of the underlying dynamic system can be exploited for optimization and associative memory. The Hopfield model is the most important RNN model, and the Boltzmann machine as well as some other stochastic dynamic models have been proposed as its generalizations. These models are especially useful for dealing with combinatorial optimization problems (COPs), which are notoriously NP-complete. In this paper, we provide a state-of-the-art introduction to these RNN models, their learning algorithms, and their analog implementations. Associative memory, COPs, simulated annealing (SA), chaotic neural networks, and multilevel Hopfield models are also important topics treated in this paper.
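To illustrate the fixed-point idea mentioned in the abstract, below is a minimal sketch of a standard Hopfield associative memory: bipolar patterns are stored with the Hebbian outer-product rule and a noisy probe is recalled by asynchronous threshold updates until no unit changes (a fixed point). This is an illustrative sketch of the textbook formulation, not code from the paper; all function names and parameters are assumptions chosen for the example.

```python
import numpy as np

def train_hebbian(patterns):
    """Store bipolar {-1,+1} patterns via the Hebbian outer-product rule."""
    patterns = np.asarray(patterns, dtype=float)
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)           # no self-connections
    return W

def recall(W, probe, max_sweeps=50, seed=None):
    """Asynchronously update units until the state stops changing (a fixed point)."""
    rng = np.random.default_rng(seed)
    s = np.asarray(probe, dtype=float).copy()
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(s.size):      # random update order per sweep
            new = 1.0 if W[i] @ s >= 0 else -1.0
            if new != s[i]:
                s[i] = new
                changed = True
        if not changed:                        # fixed point reached
            break
    return s

if __name__ == "__main__":
    stored = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                       [1, 1, -1, -1, 1, 1, -1, -1]])
    W = train_hebbian(stored)
    noisy = stored[0].copy()
    noisy[0] *= -1                             # corrupt one bit of the first pattern
    print(recall(W, noisy))                    # typically converges back to stored[0]
```

Under the Hebbian rule the stored patterns become (approximate) fixed points of the dynamics, which is the property the Hopfield model and its stochastic generalizations exploit for associative recall and for encoding combinatorial optimization problems.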

Keywords:
Computer science, Content-addressable memory, Associative property, Artificial neural network, Bidirectional associative memory, Recurrent neural network, Artificial intelligence, Cognitive science, Machine learning, Psychology

Metrics

Cited by: 8
References: 121
FWCI (Field-Weighted Citation Impact): 0.39
Citation Normalized Percentile: 0.73

Topics

Neural Networks and Applications (Physical Sciences → Computer Science → Artificial Intelligence)
Fuzzy Logic and Control Systems (Physical Sciences → Computer Science → Artificial Intelligence)
Metaheuristic Optimization Algorithms Research (Physical Sciences → Computer Science → Artificial Intelligence)