JOURNAL ARTICLE

Paraphrase Generation Model Integrating Transformer Architecture, Part-of-Speech Features, and Pointer Generator Network

Yu-Chia Tsai, Feng-Cheng Lin

Year: 2023 | Journal: IEEE Access | Vol. 11 | Pages: 30109-30117 | Publisher: Institute of Electrical and Electronics Engineers

Abstract

In recent years, hardware advancements have enabled natural language processing tasks that were previously difficult to achieve because of their intensive computational requirements. This study focuses on paraphrase generation, which entails rewriting a sentence with different words and sentence structures while preserving its original meaning. Paraphrasing increases sentence diversity, thereby improving the performance of downstream tasks such as question-answering systems and machine translation. This study proposes a novel paraphrase generation model that combines the Transformer architecture with part-of-speech features; the model is trained on a Chinese corpus. The part-of-speech features are incorporated to improve the performance of the Transformer architecture, and a pointer generator network is used when the training data contain low-frequency words, allowing the model to copy input words that carry important information according to their attention distributions.
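The copying step described in the abstract can be sketched as follows. This is a minimal illustrative example of the standard pointer-generator mixing rule (blending the decoder's vocabulary distribution with the attention distribution over source tokens, as in See et al.'s pointer-generator formulation); it is not code from the paper, and all names and values are hypothetical.

```python
import numpy as np

def pointer_generator_mix(p_vocab, attention, src_ids, extended_vocab_size, p_gen):
    """Blend generation and copying into one output distribution.

    p_vocab: (V,) decoder softmax over the fixed vocabulary
    attention: (T,) attention weights over the T source tokens
    src_ids: (T,) ids of the source tokens in the extended vocabulary
             (out-of-vocabulary words get ids >= V)
    p_gen: scalar in [0, 1], probability of generating vs. copying
    """
    p_final = np.zeros(extended_vocab_size)
    # Generation path: scale the vocabulary distribution by p_gen.
    p_final[:len(p_vocab)] = p_gen * p_vocab
    # Copy path: scatter-add the attention mass onto the source tokens' ids,
    # so low-frequency/OOV input words receive nonzero probability.
    np.add.at(p_final, src_ids, (1.0 - p_gen) * attention)
    return p_final

# Toy example: vocabulary of 5 words plus one OOV source token (id 5).
p_vocab = np.array([0.1, 0.4, 0.2, 0.2, 0.1])
attention = np.array([0.7, 0.3])   # attends mostly to the OOV word
src_ids = np.array([5, 1])         # token 0 is OOV, token 1 is in-vocab
p_final = pointer_generator_mix(p_vocab, attention, src_ids, 6, p_gen=0.5)
# p_final sums to 1, and the OOV token (id 5) can now be emitted.
```

Because both paths are convex-combined with weight p_gen, the result remains a valid probability distribution, and attention mass on an out-of-vocabulary source word translates directly into probability of copying it.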

Keywords:
Computer science, Paraphrase, Transformer, Machine translation, Natural language processing, Pointer (user interface), Sentence, Artificial intelligence, Architecture, Question answering, Rewriting, Speech recognition, Programming language

Metrics

Cited By: 6
FWCI (Field-Weighted Citation Impact): 1.53
References: 26
Citation Normalized Percentile: 0.81

Topics

Topic Modeling
Physical Sciences →  Computer Science →  Artificial Intelligence
Natural Language Processing Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Multimodal Machine Learning Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition