Recurrent neural networks (RNNs) are a powerful model for sequential data. RNNs that use long short-term memory (LSTM) cells have proven effective in handwriting recognition, language modeling, speech recognition, and language comprehension tasks. In this study, we propose the LSTM conditional random field (LSTM-CRF), an LSTM-based RNN model that exploits output-label dependencies through transition features and a CRF-like sequence-level objective function. We also propose variants of the LSTM-CRF model that use a gated recurrent unit (GRU) and a structurally constrained recurrent network (SCRN). Empirical results reveal that our proposed models attain state-of-the-art performance for named entity recognition.
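The sequence-level scoring that the CRF layer adds on top of the LSTM can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: given per-position emission scores (as an LSTM would produce) and a label-transition matrix, `sequence_score` scores one tag sequence and `viterbi_decode` finds the best one; both function names and the toy dimensions are illustrative assumptions.

```python
import numpy as np

def sequence_score(emissions, transitions, tags):
    """Score of one tag sequence: emission scores plus
    transition scores between consecutive output labels."""
    score = emissions[0, tags[0]]
    for t in range(1, len(tags)):
        score += transitions[tags[t - 1], tags[t]] + emissions[t, tags[t]]
    return score

def viterbi_decode(emissions, transitions):
    """Highest-scoring tag sequence under emission + transition scores."""
    n_steps, n_tags = emissions.shape
    score = emissions[0].copy()
    backpointers = []
    for t in range(1, n_steps):
        # candidate[i, j] = best score ending in tag i, then moving to tag j
        candidate = score[:, None] + transitions + emissions[t][None, :]
        backpointers.append(candidate.argmax(axis=0))
        score = candidate.max(axis=0)
    best_path = [int(score.argmax())]
    for bp in reversed(backpointers):
        best_path.append(int(bp[best_path[-1]]))
    return best_path[::-1], float(score.max())
```

In training, the CRF-like objective would normalize such sequence scores over all possible tag sequences (via a forward-algorithm log-partition term); the sketch above shows only the scoring and decoding side.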