Peng Wang, Zhe Wang, Xiaowang Zhang, Kewen Wang, Zhiyong Feng
Named entity recognition (NER) is the task of identifying and classifying named entities in texts. NER can benefit from linguistic dependency information, yet existing NER models can only exploit such information on datasets where dependency annotations are readily available. Dependency parsing (DP) models can be used to generate these annotations, but they are trained independently of the NER task and can propagate errors to NER. In this paper, we propose a joint NER and DP model through multi-task learning, which allows the NER and DP modules to benefit from joint training and provides an end-to-end solution to dependency-guided NER. Our model JOINDER uses a shared contextualized embedder, a word encoder, a biaffine dependency classifier, and a multi-hop dependency-guided NER module. Experiments on several standard datasets in four languages show the effectiveness of joint learning and the outstanding performance of JOINDER compared to existing models. Moreover, our model can transfer dependency knowledge to other datasets with no dependency annotations.
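The abstract mentions a biaffine dependency classifier, a standard component for scoring head-dependent word pairs in dependency parsers. The sketch below shows the biaffine scoring idea in numpy; all dimensions, names, and the scoring form are illustrative assumptions, not details taken from the JOINDER paper itself.

```python
# Minimal sketch of a biaffine dependency scorer of the kind the
# abstract describes. Sizes and variable names are illustrative
# assumptions, not the paper's actual configuration.
import numpy as np

def biaffine_scores(H_head, H_dep, U, W, b):
    """Score every (head, dependent) word pair.

    H_head, H_dep : (n, d) arrays of head / dependent word representations
    U             : (d, d) bilinear weight matrix
    W             : (2d,)  linear weight over the concatenated pair
    b             : scalar bias
    Returns an (n, n) matrix whose entry [i, j] scores word i as the
    syntactic head of word j.
    """
    n, d = H_head.shape
    bilinear = H_head @ U @ H_dep.T                                 # (n, n)
    linear = (H_head @ W[:d])[:, None] + (H_dep @ W[d:])[None, :]   # (n, n)
    return bilinear + linear + b

rng = np.random.default_rng(0)
n, d = 5, 8                      # toy sentence: 5 words, 8-dim encodings
H = rng.normal(size=(n, d))      # stand-in for the word encoder's output
scores = biaffine_scores(H, H, rng.normal(size=(d, d)),
                         rng.normal(size=2 * d), 0.0)
# Greedy head prediction: for each dependent (column), pick the
# highest-scoring head (row).
heads = scores.argmax(axis=0)
print(scores.shape, heads)
```

In a joint model such as the one described, the word representations feeding this scorer would be shared with the NER module, so gradients from both tasks shape the encoder.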