Text generation technology in natural language processing has gained prominence, notably for automating the production of comprehensible text in journalism, where it has become a research hotspot. Generating fluent, readable news content that coherently integrates key news elements remains a significant challenge. The technology's potential lies in reducing journalists' repetitive work, freeing them to focus on creative contributions. Common text generation models, however, often have large parameter counts and produce repetitive or inaccurate content. To address these challenges in news text generation, we present a lightweight model parameterization method that combines the BART (Bidirectional and Auto-Regressive Transformers) model with a pointer-generator network. Experimental results show that, with fewer parameters than baseline models, our model achieves a Rouge-L score of 57.6% on a real-world news dataset, demonstrating improved performance.
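The abstract names a pointer-generator network on top of BART. The paper's own architecture is not shown here, but the core pointer-generator idea, mixing the decoder's vocabulary distribution with a copy distribution over source tokens, weighted by a generation probability p_gen, can be sketched in plain Python. All function and variable names below are illustrative assumptions, not the authors' code:

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    e = [math.exp(x - m) for x in xs]
    s = sum(e)
    return [v / s for v in e]

def pointer_generator_step(vocab_logits, attention, source_ids, p_gen):
    """One decoding step of a pointer-generator mixture.

    vocab_logits: decoder logits over the vocabulary
    attention:    normalized attention weights over source positions
    source_ids:   vocabulary id of the token at each source position
    p_gen:        probability of generating (vs. copying), in [0, 1]
    """
    p_vocab = softmax(vocab_logits)
    # Generation part: scale the vocabulary distribution by p_gen.
    final = [p_gen * p for p in p_vocab]
    # Copy part: scatter (1 - p_gen) * attention mass onto source token ids.
    for a, tok in zip(attention, source_ids):
        final[tok] += (1.0 - p_gen) * a
    return final
```

Because both components are probability distributions, the mixture still sums to one, and tokens present in the source receive extra probability mass, which is what lets the model reproduce rare entities such as names and numbers in news text.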
Gaduh Hartawan, Dian Sa'adillah Maylawati, Wisnu Uriawan
Xingyu Ma, Songfeng Lu, H Liu, Bingyan Feng
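The abstract reports results as a Rouge-L score, which is the F-measure derived from the longest common subsequence (LCS) between a generated text and a reference. As a point of reference, a minimal stdlib-only sketch of that computation follows; the function names and the beta weighting are illustrative assumptions, not the evaluation code used in the paper:

```python
def lcs_length(a, b):
    """Longest-common-subsequence length via dynamic programming."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a, 1):
        for j, y in enumerate(b, 1):
            dp[i][j] = dp[i - 1][j - 1] + 1 if x == y else max(dp[i - 1][j], dp[i][j - 1])
    return dp[-1][-1]

def rouge_l(candidate, reference, beta=1.2):
    """ROUGE-L F-score from LCS-based precision and recall.

    candidate, reference: token lists; beta weights recall over precision.
    """
    lcs = lcs_length(candidate, reference)
    if lcs == 0:
        return 0.0
    prec = lcs / len(candidate)
    rec = lcs / len(reference)
    return (1 + beta ** 2) * prec * rec / (rec + beta ** 2 * prec)
```

An identical candidate and reference score 1.0, and the score degrades as the shared in-order token subsequence shrinks, which is why Rouge-L rewards fluent summaries that preserve the reference's ordering of key elements.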