JOURNAL ARTICLE

Compression of Generative Pre-trained Language Models via Quantization

Chaofan Tao, Lu Hou, Wei Zhang, Lifeng Shang, Xin Jiang, Qun Liu, Ping Luo, Ngai Wong

Year: 2022
Published in: Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Pages: 4821-4836


Keywords:
Computer science, Natural language processing, Quantization (signal processing), Artificial intelligence, Speech recognition, Algorithm

Metrics

Cited by: 42
FWCI (Field-Weighted Citation Impact): 4.94
References: 39
Citation Normalized Percentile: 0.96 (in top 10%)

Topics (Physical Sciences → Computer Science → Artificial Intelligence)

Natural Language Processing Techniques
Topic Modeling
Speech Recognition and Synthesis