JOURNAL ARTICLE

News Text Generation Method Integrating Pointer-Generator Network with Bidirectional Auto-Regressive Transformer

Abstract

Text generation has become a prominent topic in natural language processing, and automatically producing comprehensible text for journalism is a research hotspot. Generating fluent, readable news content that coherently integrates key news elements remains a significant challenge. The technology's potential lies in relieving journalists of repetitive work so they can focus on creative contributions. Common text generation models, however, often have large parameter counts and produce repetitive or inaccurate content. To address these challenges in news text generation, we present a lightweight model parameterization method that combines the BART (Bidirectional and Auto-Regressive Transformers) model with a pointer-generator network. Experimental results show that, with fewer parameters than the baseline models, our model achieves a Rouge-L score of 57.6% on a real-world news dataset, demonstrating improved performance.
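The core idea of a pointer-generator network, as referenced in the abstract, is to blend the decoder's vocabulary distribution with a copy distribution over the source tokens, weighted by a generation probability p_gen. The sketch below is illustrative only, not the authors' implementation: function names, shapes, and the simplification that source tokens are already in the vocabulary (no extended-vocabulary handling for OOV words) are all assumptions.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def pointer_generator_dist(vocab_logits, attn_scores, src_ids, p_gen):
    """Blend generation and copying for one decoding step.

    vocab_logits: decoder scores over the fixed vocabulary
    attn_scores:  attention scores over the source positions
    src_ids:      vocabulary id of each source token
    p_gen:        probability of generating (vs. copying), in [0, 1]

    Simplification: every source token is assumed to be in the
    vocabulary; a full pointer-generator would extend the vocabulary
    with source OOV tokens.
    """
    vocab_dist = softmax(vocab_logits)   # sums to 1
    attn_dist = softmax(attn_scores)     # sums to 1
    # Scale the generation distribution by p_gen ...
    final = [p_gen * p for p in vocab_dist]
    # ... and add the copy distribution, scaled by (1 - p_gen),
    # onto the vocabulary entries of the attended source tokens.
    for a, tok in zip(attn_dist, src_ids):
        final[tok] += (1.0 - p_gen) * a
    return final
```

Because the generation part sums to p_gen and the copy part to 1 − p_gen, the blended result is again a valid probability distribution; the copy term is what lets the model reproduce rare named entities from the source article verbatim.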

Keywords:
Text generation, Natural language generation, Natural language processing, Pointer-generator network, Transformer, Artificial intelligence, Computer science

Metrics

Cited by: 6
FWCI (Field-Weighted Citation Impact): 1.53
References: 19
Citation Normalized Percentile: 0.83
Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Multimodal Machine Learning Applications (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)