Yanmin Chen, Hao Wang, Ruijun Sun, Enhong Chen
Semantic matching plays an important role in natural language processing and underpins many downstream applications, such as question-answer matching, chatbots, and customer service. Although considerable effort has been devoted to this task, existing work focuses mainly on matching a single sentence pair. In many real-world scenarios, however, a sentence has semantic associations with multiple sentences, and this relational information between sentences has been largely ignored in previous research. To this end, we propose a novel Self-attention Relational Sentence Semantic Matching (SR-SSM) framework that jointly exploits the sentences and their relational information to achieve better semantic matching. Specifically, we first use an LSTM network to obtain the original sentence representation. We then incorporate information from adjacent sentences through a self-attention mechanism to obtain a context-aware sentence representation. Finally, we adopt a multi-layer perceptron to model the interaction between sentence pairs and predict their semantic similarity. Experiments on public datasets validate the effectiveness of SR-SSM, which outperforms state-of-the-art baselines by a significant margin.
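The core idea, enriching each sentence vector with information from its neighbors via self-attention before matching, can be illustrated with a minimal numpy sketch. This is not the authors' implementation: random vectors stand in for the LSTM sentence encodings, and the unparameterized scaled dot-product attention shown here is only one plausible form of the self-attention step.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attend(sent_vecs):
    """Scaled dot-product self-attention over a group of sentence vectors.

    Each row of the output is a context-aware representation: a weighted
    mixture of all sentences in the group, weighted by similarity.
    """
    d = sent_vecs.shape[-1]
    scores = sent_vecs @ sent_vecs.T / np.sqrt(d)   # pairwise similarities
    weights = softmax(scores, axis=-1)              # rows sum to 1
    return weights @ sent_vecs, weights

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Placeholder for LSTM encodings: 4 adjacent sentences, 8-dim vectors.
rng = np.random.default_rng(0)
sents = rng.standard_normal((4, 8))

ctx, w = self_attend(sents)          # context-aware representations
sim = cosine(ctx[0], ctx[1])         # score a pair of enriched sentences
print(ctx.shape, round(sim, 3))
```

In the full framework, the cosine score here would be replaced by a multi-layer perceptron over the interaction of the two context-aware representations.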