JOURNAL ARTICLE

Distillation of Large Language Models for Text Simplification

Oleksandr Skurzhanskyi (Олександр Скуржанський)

Year: 2023 | Journal: Modeling Control and Information Technologies | Pages: 230–231

Abstract

This work presents a methodology for harnessing the capabilities of Large Language Models (LLMs) to address specific Natural Language Processing (NLP) tasks, with a focus on Text Simplification. While LLMs have demonstrated strong performance across a wide range of NLP challenges, their demanding computational requirements can render them impractical for real-time online inference. To address this limitation, we propose text distillation, a technique for transferring the knowledge stored within LLMs to more compact and computationally efficient neural networks.
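The abstract does not detail the training objective, but knowledge transfer from a large teacher to a compact student is commonly implemented with a soft-label distillation loss (KL divergence between temperature-softened teacher and student distributions, in the style of Hinton et al.). The sketch below is illustrative only; the function names and the temperature value are assumptions, not the paper's actual method.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution,
    # exposing more of the teacher's "dark knowledge" about non-top classes.
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    # KL(teacher || student) over temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return temperature ** 2 * kl

# A student that exactly matches the teacher incurs zero loss;
# a mismatched student is penalized.
teacher = [2.0, 0.5, -1.0]
print(distillation_loss(teacher, teacher))            # 0.0
print(distillation_loss(teacher, [0.1, 0.1, 0.1]) > 0)  # True
```

In practice this soft-label term is typically mixed with the ordinary cross-entropy loss on ground-truth labels when training the compact student network.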

Keywords:
Large Language Models, Text Simplification, Knowledge Distillation, Natural Language Processing, Artificial Neural Networks, Machine Learning, Inference, Artificial Intelligence

Metrics

Cited by: 0
FWCI (Field-Weighted Citation Impact): 0.00
References: 9
Citation Normalized Percentile: 0.16

Topics

Text Readability and Simplification (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)