JOURNAL ARTICLE

GrammarT5: Grammar-Integrated Pretrained Encoder-Decoder Neural Model for Code

Abstract

Pretrained models for code have exhibited promising performance across various code-related tasks, such as code summarization, code completion, code translation, and bug detection. Despite this success, however, most current models still represent code as a plain token sequence, which may not adequately capture its underlying syntactic structure.
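To make the contrast concrete, the sketch below shows the same snippet viewed as a flat token sequence versus as grammar-level AST nodes. This is a minimal illustration in plain Python (using the standard tokenize and ast modules), not GrammarT5's actual representation or pipeline; the example snippet and helper names are chosen purely for demonstration.

```python
import ast
import io
import tokenize

SRC = "def add(a, b):\n    return a + b\n"

# Token-sequence view: roughly what sequence-based code models consume.
# Whitespace-only tokens (NEWLINE, INDENT, ENDMARKER) are filtered out.
tokens = [tok.string
          for tok in tokenize.generate_tokens(io.StringIO(SRC).readline)
          if tok.string.strip()]
print("tokens:", tokens)

# Grammar view: a pre-order walk over AST node types, which makes the
# syntactic structure (FunctionDef -> arguments -> Return -> BinOp) explicit.
def preorder(node):
    yield type(node).__name__
    for child in ast.iter_child_nodes(node):
        yield from preorder(child)

print("grammar nodes:", list(preorder(ast.parse(SRC))))
```

The token view flattens nesting into a linear string of symbols, while the grammar view encodes the nesting explicitly; closing that gap is the motivation behind grammar-integrated pretraining.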

Keywords:
Computer science, Code, Automatic summarization, Programming language, Encoder, Code generation, Natural language processing, Machine translation, Artificial intelligence, Token

Metrics

Cited By: 8
FWCI (Field-Weighted Citation Impact): 12.22
References: 23
Citation Normalized Percentile: 0.97 (in top 10%)

Topics

Software Engineering Research (Physical Sciences → Computer Science → Information Systems)
Software Testing and Debugging Techniques (Physical Sciences → Computer Science → Software)
Advanced Malware Detection Techniques (Physical Sciences → Computer Science → Signal Processing)
