JOURNAL ARTICLE

Cross-Lingual Summarization: English - Bahasa Indonesia

Abstract

Progress in abstractive summarization has accelerated since the introduction of sequence-to-sequence neural networks. Summarization is no longer limited to selecting words or sentences that already exist in the source document, as in the extractive approach, but can generate completely new words or sentences that never appeared in the source document. A big push came from machine translation research with the introduction of the attention mechanism, which is the key to the information-bottleneck problem in the encoder-decoder model. Cross-Lingual Summarization (CLS) is the task of generating a summary in a target language from a source document in a different language. Traditional methods split this task into two steps: summarization and translation. This paper describes a study on CLS without explicitly using a translator, thereby removing one step from the existing method. We incorporate multilingual embeddings into a sequence-to-sequence neural network with an attention mechanism to handle this task. The multilingual embeddings represent words as if the source language and the target language were the same language. Experiments on the Amazon Fine Food Reviews data show comparable performance between monolingual and cross-lingual summarization, with ROUGE scores only 1–2 points apart.
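The core idea of "representing words as if the source and target language were the same language" can be sketched with a toy shared embedding space. Everything below (the vocabulary, the vectors, and the helper names) is hypothetical for illustration; it is not the paper's actual embedding model:

```python
import numpy as np

# Toy shared (multilingual) embedding table. English and Indonesian
# words that translate to each other are given nearby vectors, so a
# summarizer can treat both languages as one vocabulary.
# All vectors are made up for illustration.
embeddings = {
    "food":    np.array([0.90, 0.10, 0.00]),
    "makanan": np.array([0.88, 0.12, 0.02]),  # Indonesian for "food"
    "good":    np.array([0.10, 0.90, 0.00]),
    "enak":    np.array([0.12, 0.85, 0.05]),  # Indonesian for "tasty"
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def nearest_word(word, vocab):
    """Most similar word in the shared space, excluding the query itself."""
    return max((w for w in vocab if w != word),
               key=lambda w: cosine(vocab[word], vocab[w]))

print(nearest_word("food", embeddings))  # -> makanan
```

Because translation pairs sit close together in the shared space, the decoder can attend to English source tokens while emitting Indonesian summary tokens (or vice versa) without an explicit translation step.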

Keywords:
Automatic summarization, Computer science, Natural language processing, Artificial intelligence, Linguistics
