Yi Ren, An Du, Tao Ning, Ayesha Siddiqua
Named entity recognition (NER) is an important task in natural language processing, and Chinese named entities are harder to identify because of the uncertainty and complexity of their boundary composition. The self-attention mechanism cannot capture direction information or the relative positions between words, and this loss of position and direction information inevitably degrades the performance of the NER task. In this work, we improve recognition by introducing a graph neural network: we encode word and position information into the model by building a graph structure combined with a dictionary, and we take full advantage of the pretraining effect of the Lattice LSTM model to resolve entity ambiguity. A Transformer encoder further extracts contextual features. Finally, a conditional random field is introduced to obtain the optimal tag sequence. Experimental results show that the improved model outperforms other comparative network models on the Recall, Precision, and F1 score evaluation metrics.
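The final step above, decoding the optimal tag sequence from a linear-chain conditional random field, is standardly done with the Viterbi algorithm. As a hedged illustration (the function name, score shapes, and NumPy implementation below are our own sketch, not the paper's code), a minimal decoder takes per-token emission scores from the encoder and a tag-transition score matrix and returns the highest-scoring tag path:

```python
import numpy as np

def viterbi_decode(emissions, transitions):
    """Find the highest-scoring tag sequence for a linear-chain CRF.

    emissions:   (seq_len, num_tags) per-token tag scores from the encoder
    transitions: (num_tags, num_tags) score of moving from tag i to tag j
    Returns the optimal tag-index sequence as a list of ints.
    """
    seq_len, num_tags = emissions.shape
    # score[t] = best score of any path that ends in tag t at the current step
    score = emissions[0].copy()
    backpointers = []
    for step in range(1, seq_len):
        # candidate[i, j] = score of ending in tag i, then transitioning to j
        candidate = score[:, None] + transitions + emissions[step][None, :]
        backpointers.append(candidate.argmax(axis=0))
        score = candidate.max(axis=0)
    # follow the back-pointers from the best final tag
    best = [int(score.argmax())]
    for bp in reversed(backpointers):
        best.append(int(bp[best[-1]]))
    return best[::-1]
```

A transition matrix that heavily penalizes an impossible move (e.g. I-tag directly after O in a BIO scheme) steers the decoder around locally attractive but globally invalid tags, which is exactly why the CRF layer helps over per-token argmax.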
Wenxia Xie, Hanyan Qin, Lijuan Hou, Xiankun Zhang
Ruoyu Zhang, Wenpeng Lü, Shoujin Wang, Xueping Peng, Rui Yu, Yuan Gao
Sheping Zhai, Gou Dan, Huizhen Wang, Yun Chai