JOURNAL ARTICLE

An Empirical Evaluation of Rule Extraction from Recurrent Neural Networks

Abstract

Rule extraction from black-box models is critical in domains that require model validation before deployment, as in credit scoring and medical diagnosis. Rule extraction is already a challenging problem in statistical learning in general, and the difficulty is even greater when highly nonlinear, recursive models, such as recurrent neural networks (RNNs), are fit to data. Here, we study the extraction of rules from second-order RNNs trained to recognize the Tomita grammars. We show that production rules can be stably extracted from trained RNNs and that, in certain cases, the rules outperform the trained RNNs.
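The setting the abstract describes can be illustrated with a minimal sketch. The following is a hypothetical example, not the paper's code: two of the seven Tomita grammars (well-known regular languages over the alphabet {0, 1}) as reference recognizers, plus one state update of a second-order RNN, whose transition multiplies the state vector by an input-dependent weight slice. The weight tensor here is randomly initialized, not trained.

```python
import numpy as np

def tomita_1(s):
    """Tomita grammar #1: the language 1* (strings containing only 1s)."""
    return all(c == "1" for c in s)

def tomita_2(s):
    """Tomita grammar #2: the language (10)*."""
    return len(s) % 2 == 0 and all(
        s[i:i + 2] == "10" for i in range(0, len(s), 2)
    )

def second_order_step(W, state, symbol, n_symbols=2):
    """One second-order RNN state update:

    s_j(t+1) = sigmoid( sum_{i,k} W[j, i, k] * s_i(t) * x_k(t) ),

    where x is a one-hot encoding of the current input symbol.
    """
    x = np.zeros(n_symbols)
    x[symbol] = 1.0
    pre = np.einsum("jik,i,k->j", W, state, x)
    return 1.0 / (1.0 + np.exp(-pre))

# Run an (untrained, randomly initialized) second-order RNN over "1010".
rng = np.random.default_rng(0)
n_states = 4
W = rng.normal(size=(n_states, n_states, 2))
state = np.zeros(n_states)
state[0] = 1.0  # designated start state
for ch in "1010":
    state = second_order_step(W, state, int(ch))
```

Rule extraction in this setting amounts to quantizing the continuous state vector after each symbol and reading off deterministic transitions between the resulting discrete states; the reference recognizers above supply the labels against which both the trained RNN and the extracted rules can be evaluated.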

Keywords:
Recurrent neural network, Computer science, Artificial intelligence, Black box, Machine learning, Artificial neural network, Rule-based machine translation

Metrics

Cited By: 56
FWCI (Field Weighted Citation Impact): 7.94
References: 42
Citation Normalized Percentile: 0.97

Topics

Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Neural Networks and Applications (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

Rule Extraction from Recurrent Neural Networks: A Taxonomy and Review

Henrik Jacobsson

Journal: Neural Computation Year: 2005 Vol: 17 (6) Pages: 1223-1263
BOOK-CHAPTER

Rule Extraction Methods from Neural Networks

Sergey Yarushev, Alexey Averkin, V. V. Kosterev

Advances in intelligent systems and computing Year: 2023 Pages: 1-9
BOOK-CHAPTER

Linguistic Rule Extraction from Neural Networks

Year: 2006 Pages: 251-275
BOOK-CHAPTER

Rule Extraction from RBF Neural Networks

Advanced information and knowledge processing Year: 2006 Pages: 157-187
JOURNAL ARTICLE

Rule extraction from binary neural networks

Marco Muselli

Year: 1999 Vol: 1999 Pages: 515-520