JOURNAL ARTICLE

Improving the robustness of machine reading comprehension via contrastive learning

Jianzhou Feng, Jiawei Sun, Di Shao, Jinman Cui

Year: 2022   Journal: Applied Intelligence   Vol: 53 (8)   Pages: 9103-9114   Publisher: Springer Science+Business Media

Abstract

Pre-trained language models achieve high performance on the machine reading comprehension task, but they lack robustness and are vulnerable to adversarial samples. Most current methods for improving model robustness are based on data enrichment; however, these methods do not address the poor context representation of machine reading comprehension models. We find that context representation plays a key role in the robustness of machine reading comprehension models: a dense context representation space results in poor robustness. To deal with this, we propose a multi-task machine reading comprehension learning framework based on contrastive learning. Its main idea is to improve the context representation space encoded by machine reading comprehension models through contrastive learning. We call this method Contrastive Learning in Context Representation Space (CLCRS). CLCRS samples sentences containing context information from the context as positive and negative samples, expanding the distance between the answer sentence and the other sentences in the context. The context representation space of the model is thereby expanded, so the model can better distinguish sentences containing correct answers from misleading sentences, which improves its robustness. Experimental results on adversarial datasets show that our method exceeds the comparison models and achieves state-of-the-art performance.
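The sentence-level contrastive objective the abstract describes can be sketched with a generic InfoNCE-style loss over sentence embeddings. The function below is an illustration of that general technique, not the authors' exact CLCRS formulation: the choice of anchor (here, a question representation), the cosine similarity, and the temperature value are all assumptions made for this sketch.

```python
import numpy as np

def info_nce_loss(anchor, positive, negatives, temperature=0.1):
    """InfoNCE-style contrastive loss over sentence embeddings.

    anchor:    (d,)   anchor representation (illustratively, the question)
    positive:  (d,)   the sentence containing the gold answer
    negatives: (n, d) other context sentences used as negatives
    """
    def cos(a, b):
        # Cosine similarity between two vectors.
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    pos_sim = cos(anchor, positive) / temperature
    neg_sims = np.array([cos(anchor, n) for n in negatives]) / temperature
    logits = np.concatenate([[pos_sim], neg_sims])
    # Loss is small when the positive dominates the softmax over all
    # candidates, i.e. when the answer sentence is pushed away from the
    # misleading sentences in representation space.
    return -pos_sim + np.log(np.exp(logits).sum())
```

Minimizing such a loss pulls the answer sentence toward the anchor while pushing the other context sentences away, which is the "expanding the distance between the answer sentence and other sentences" effect the abstract attributes to CLCRS.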

Keywords:
Computer science, Robustness (evolution), Reading comprehension, Artificial intelligence, Sentence, Natural language processing, Comprehension, Machine learning, Reading (process), Linguistics

Metrics

- Cited by: 8
- FWCI (Field-Weighted Citation Impact): 1.57
- References: 26
- Citation Normalized Percentile: 0.81


Topics

- Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
- Multimodal Machine Learning Applications (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)
- Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

To Answer or Not To Answer? Improving Machine Reading Comprehension Model with Span-based Contrastive Learning

Yunjie Ji, Liangyu Chen, Chenxiao Dou, Baochang Ma, Xiangang Li

Journal: Findings of the Association for Computational Linguistics: NAACL 2022   Year: 2022   Pages: 1292-1300
JOURNAL ARTICLE

Contrastive Learning between Classical and Modern Chinese for Classical Chinese Machine Reading Comprehension

Maofu Liu, Junyi Xiang, Xia Xu, Huijun Hu

Journal: ACM Transactions on Asian and Low-Resource Language Information Processing   Year: 2022   Vol: 22 (2)   Pages: 1-22
JOURNAL ARTICLE

Improving Machine Reading Comprehension with Multi-Task Learning and Self-Training

Jianquan OuyangMengen Fu

Journal: Mathematics   Year: 2022   Vol: 10 (3)   Pages: 310