JOURNAL ARTICLE

Stronger Baselines for Grammatical Error Correction Using a Pretrained Encoder-Decoder Model

Abstract

Studies on grammatical error correction (GEC) have reported the effectiveness of pretraining a Seq2Seq model with a large amount of pseudodata. However, this approach requires time-consuming, GEC-specific pretraining because of the size of the pseudodata. In this study, we explored the utility of bidirectional and auto-regressive transformers (BART) as a generic pretrained encoder-decoder model for GEC. By using this generic pretrained model for GEC, the time-consuming, task-specific pretraining can be eliminated. We find that monolingual and multilingual BART models achieve high performance in GEC, with one result being comparable to the current strong results in English GEC.
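The abstract frames GEC as monolingual "translation" from errorful to corrected text, which is what lets a generic pretrained encoder-decoder such as BART be fine-tuned directly on parallel GEC data with no task-specific pretraining. A minimal sketch of preparing such source/target pairs (pure Python; the function name and tab-separated format are illustrative assumptions, not from the paper):

```python
def make_gec_pairs(parallel_lines):
    """Turn tab-separated 'errorful<TAB>corrected' lines into
    (source, target) pairs, the standard input format for
    fine-tuning a Seq2Seq model on GEC.

    Illustrative sketch only; the paper's actual preprocessing
    pipeline is not described here.
    """
    pairs = []
    for line in parallel_lines:
        # Source side is the learner sentence, target side is the
        # human-corrected reference.
        src, tgt = line.rstrip("\n").split("\t")
        pairs.append({"source": src, "target": tgt})
    return pairs


data = make_gec_pairs([
    "She go to school every day.\tShe goes to school every day.",
    "I has a pen.\tI have a pen.",
])
```

From here, fine-tuning would feed `source` to the encoder and train the decoder to emit `target`, exactly as in machine translation; the point of the paper is that BART's generic pretraining already provides a strong initialization for this step.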

Keywords:
Encoder; Computer science; Transformer; Decoding methods; Artificial intelligence; Speech recognition; Error detection and correction; Natural language processing; Algorithm; Voltage; Engineering

Metrics

Cited By: 25
FWCI (Field Weighted Citation Impact): 2.64
Refs: 0
Citation Normalized Percentile: 0.91


Topics

Natural Language Processing Techniques
Physical Sciences → Computer Science → Artificial Intelligence
Topic Modeling
Physical Sciences → Computer Science → Artificial Intelligence
Text Readability and Simplification
Physical Sciences → Computer Science → Artificial Intelligence

Related Documents

JOURNAL ARTICLE

Stronger Baselines for Grammatical Error Correction Using a Pretrained Encoder-Decoder Model

Satoru Katsumata

Journal: Journal of Natural Language Processing, Year: 2021, Vol: 28 (1), Pages: 276-280
JOURNAL ARTICLE

A Multilayer Convolutional Encoder-Decoder Neural Network for Grammatical Error Correction

Shamil Chollampatt, Hwee Tou Ng

Journal: Proceedings of the AAAI Conference on Artificial Intelligence, Year: 2018, Vol: 32 (1)
© 2026 ScienceGate Book Chapters — All rights reserved.