JOURNAL ARTICLE

Multi-Encoder Transformer for Korean Abstractive Text Summarization

Youhyun Shin

Year: 2023  Journal: IEEE Access  Vol: 11  Pages: 48768-48782  Publisher: Institute of Electrical and Electronics Engineers

Abstract

In this paper, we propose a Korean abstractive text summarization approach that uses a multi-encoder transformer. Recently, in many natural language processing (NLP) tasks, the use of pre-trained language models (PLMs) for transfer learning has achieved remarkable performance. In particular, transformer-based models such as Bidirectional Encoder Representations from Transformers (BERT) are pre-trained and then applied to downstream tasks, achieving state-of-the-art performance in areas including abstractive text summarization. However, existing text summarization models usually use one pre-trained model per model architecture, meaning that a single PLM must be chosen at a time. For PLMs applicable to Korean abstractive text summarization, several publicly available BERT-based pre-trained Korean models offer different advantages, such as Multilingual BERT, KoBERT, HanBERT, and KorBERT. We assume that if these PLMs could be leveraged simultaneously, better performance would be obtained. We propose a model that uses multiple encoders capable of leveraging multiple pre-trained models to create an abstractive summary. We evaluate our method using three benchmark Korean abstractive summarization datasets: the Law (AI-Hub), News (AI-Hub), and News (NIKL) datasets. Experimental results show that the proposed multi-encoder model variations outperform single-encoder models. We find the empirically best summarization model by determining the optimal input combination when leveraging multiple PLMs with the multi-encoder method.
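The abstract describes an encoder-decoder transformer in which several encoders, each intended to carry a different pre-trained Korean BERT, process the source text and their representations are fused before decoding. The following is a minimal PyTorch sketch of that idea only, not the authors' implementation: it uses small randomly initialized encoders and a shared token embedding as stand-ins (the actual model would plug in separate PLMs such as KoBERT or KorBERT with their own tokenizers), and the concatenation-plus-projection fusion is one assumed fusion strategy among several the paper may compare.

```python
import torch
import torch.nn as nn

class MultiEncoderSummarizer(nn.Module):
    """Sketch of a multi-encoder summarizer: N parallel encoders,
    fused memory, one shared decoder. Encoder stubs stand in for
    distinct pre-trained Korean BERTs (assumption for illustration)."""

    def __init__(self, d_model=64, n_encoders=2, vocab_size=1000):
        super().__init__()
        # One lightweight transformer encoder per (would-be) PLM.
        self.encoders = nn.ModuleList(
            nn.TransformerEncoder(
                nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
                num_layers=1,
            )
            for _ in range(n_encoders)
        )
        # Fuse concatenated encoder outputs back to model width.
        self.fuse = nn.Linear(d_model * n_encoders, d_model)
        self.decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=1,
        )
        self.embed = nn.Embedding(vocab_size, d_model)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_ids):
        src = self.embed(src_ids)                      # (B, S, d)
        # Run every encoder on the source, concatenate along features,
        # then project to form the decoder's memory.
        memory = self.fuse(
            torch.cat([enc(src) for enc in self.encoders], dim=-1)
        )                                              # (B, S, d)
        tgt = self.embed(tgt_ids)                      # (B, T, d)
        return self.out(self.decoder(tgt, memory))    # (B, T, V)

model = MultiEncoderSummarizer()
logits = model(
    torch.randint(0, 1000, (2, 16)),  # source token ids
    torch.randint(0, 1000, (2, 8)),   # target-so-far token ids
)
print(logits.shape)  # torch.Size([2, 8, 1000])
```

In the paper's setting each encoder would instead wrap a frozen or fine-tuned PLM with its own tokenizer, so each sees its own tokenization of the input rather than a shared embedding.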

Keywords:
Automatic summarization, Computer science, Encoder, Transformer, Artificial intelligence, Natural language processing, Language model, Benchmark, Machine learning

Metrics

Cited By: 19
FWCI (Field-Weighted Citation Impact): 4.85
References: 41
Citation Normalized Percentile: 0.94 (in top 10%)

Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Computational and Text Analysis Methods (Social Sciences → General Social Sciences)

Related Documents

BOOK-CHAPTER

Keyword-Aware Encoder for Abstractive Text Summarization

Tianxiang Hu, Jingxi Liang, Wei Ye, Shikun Zhang

Lecture Notes in Computer Science  Year: 2021  Pages: 37-52
JOURNAL ARTICLE

Transformer-based abstractive Indonesian text summarization

Miracle Aurelia, Sheila Monica, Abba Suganda Girsang

Journal: International Journal of Informatics and Communication Technology (IJ-ICT)  Year: 2024  Vol: 13 (3)  Pages: 388-388
JOURNAL ARTICLE

IWM-LSTM encoder for abstractive text summarization

Ravindra Gangundi, Rajeswari Sridhar

Journal: Multimedia Tools and Applications  Year: 2024  Vol: 84 (9)  Pages: 5883-5904
JOURNAL ARTICLE

Abstractive Text Summarization in English Using Swin Transformer Encoder based Recurrent Neural Network

Journal: International Journal of Intelligent Engineering and Systems  Year: 2025  Vol: 18 (6)  Pages: 904-915