JOURNAL ARTICLE

LSTM-CRF Models for Named Entity Recognition

Changki Lee

Year: 2017 | Journal: IEICE Transactions on Information and Systems | Vol: E100.D (4) | Pages: 882-887 | Publisher: Institute of Electronics, Information and Communication Engineers

Abstract

Recurrent neural networks (RNNs) are a powerful model for sequential data. RNNs that use long short-term memory (LSTM) cells have proven effective in handwriting recognition, language modeling, speech recognition, and language comprehension tasks. In this study, we propose LSTM conditional random fields (LSTM-CRF), an LSTM-based RNN model that captures output-label dependencies with transition features and a CRF-like sequence-level objective function. We also propose variants of the LSTM-CRF model that use a gated recurrent unit (GRU) and a structurally constrained recurrent network (SCRN). Empirical results reveal that our proposed models attain state-of-the-art performance for named entity recognition.

Keywords:
Recurrent neural network, Computer science, Conditional random field, Artificial intelligence, Language model, Speech recognition, Named-entity recognition, Long short-term memory, Sequence (biology), Handwriting recognition, Natural language processing, Artificial neural network, Machine learning, Pattern recognition (psychology), Feature extraction

Metrics

Cited by: 26
FWCI (Field-Weighted Citation Impact): 1.60
References: 14
Citation Normalized Percentile: 0.85

Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Speech Recognition and Synthesis (Physical Sciences → Computer Science → Artificial Intelligence)