JOURNAL ARTICLE

Generalisation of Feed-Forward Neural Networks and Recurrent Neural Networks

Rui Wang

Year: 2024  Journal: Applied and Computational Engineering  Vol: 40 (1)  Pages: 242-246

Abstract

This paper presents an in-depth analysis of Feed-Forward Neural Networks (FNNs) and Recurrent Neural Networks (RNNs), two powerful models in the field of artificial intelligence. Understanding these models and their applications is crucial for harnessing their potential. The study addresses the need to comprehend the unique characteristics and architectures of FNNs and RNNs. RNNs in particular excel at processing sequential and temporal data, making them indispensable in tasks involving such data. Furthermore, the paper emphasises the importance of variables in FNNs and proposes a novel method to rank the importance of independent variables in predicting the output variable. By understanding the relationship between inputs and outputs, valuable insights can be gained into the underlying patterns and mechanisms driving the system being modelled. Additionally, the research explores the impact of initial weights on model performance. Contrary to conventional belief, the study provides evidence that neural networks with random weights can achieve competitive performance, particularly in situations with limited training data. This finding challenges the traditional notion that careful initialization is necessary for neural networks to perform well. In summary, this paper provides a comprehensive analysis of FNNs and RNNs while highlighting the importance of understanding the relationship between variables and the impact of initial weights on model performance. By shedding light on these crucial aspects, this research contributes to the advancement and effective utilisation of neural networks, paving the way for improved predictions and insights in various domains.

Keywords:
Computer science, Artificial neural network, Initialization, Recurrent neural network, Artificial intelligence, Machine learning, Variable (mathematics), Feedforward neural network, Rank (graph theory), Field (mathematics), Mathematics

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
Refs: 4
Citation Normalized Percentile: 0.01

Topics

Neural Networks and Applications
Physical Sciences →  Computer Science →  Artificial Intelligence
Stock Market Forecasting Methods
Social Sciences →  Decision Sciences →  Management Science and Operations Research
EEG and Brain-Computer Interfaces
Life Sciences →  Neuroscience →  Cognitive Neuroscience

Related Documents

BOOK-CHAPTER

Feed-forward Neural Networks

Yoav Goldberg

Synthesis lectures on human language technologies Year: 2017 Pages: 41-49
BOOK-CHAPTER

Feed Forward Neural Networks

Nikhil Ketkar

Apress eBooks Year: 2017 Pages: 17-33
BOOK-CHAPTER

Feed-Forward Neural Networks

Umberto Michelucci

Apress eBooks Year: 2022 Pages: 61-109
BOOK-CHAPTER

Feed-Forward Neural Networks

Nikhil Ketkar, Jojo Moolayil

Apress eBooks Year: 2021 Pages: 93-131
BOOK-CHAPTER

Feed-Forward Neural Networks

Barry K. Lavine, Todd Blank

Elsevier eBooks Year: 2009 Pages: 571-586