Abstract

Many current sentence compression techniques produce a shortened form of a sentence by relying on syntactic structure such as dependency tree representations. While the performance of sentence compression has been improving, these approaches require a full parse of the sentence before compression can begin, making it difficult to perform compression in real time. In this paper, we examine the possibility of performing incremental sentence compression using long short-term memory (LSTM) recurrent neural networks (RNNs). The decision of whether to remove a word is made at each time step, without waiting for the end of the sentence. Various RNN parameters are investigated, including the number of layers and network connections. Furthermore, we propose pretraining the network as an autoencoder. Experimental results reveal that our method obtains compression rates similar to human references and higher accuracy than state-of-the-art tree transduction models.
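The incremental setup described in the abstract can be illustrated with a minimal sketch: a single LSTM cell reads one token embedding at a time and immediately emits a keep/delete probability, so no decision waits for the rest of the sentence. The class name, dimensions, and the single-cell architecture below are illustrative assumptions, not the authors' exact model.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class IncrementalCompressor:
    """Toy LSTM that labels each token keep/delete as it arrives."""

    def __init__(self, emb_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        d = emb_dim + hidden_dim
        # one weight matrix per LSTM gate: input, forget, output, candidate
        self.W = {g: rng.normal(0, 0.1, (hidden_dim, d)) for g in "ifoc"}
        self.b = {g: np.zeros(hidden_dim) for g in "ifoc"}
        self.w_out = rng.normal(0, 0.1, hidden_dim)  # binary keep/delete head
        self.hidden_dim = hidden_dim

    def compress(self, embeddings, threshold=0.5):
        h = np.zeros(self.hidden_dim)
        c = np.zeros(self.hidden_dim)
        keep = []
        for x in embeddings:                  # one time step per token
            z = np.concatenate([x, h])
            i = sigmoid(self.W["i"] @ z + self.b["i"])
            f = sigmoid(self.W["f"] @ z + self.b["f"])
            o = sigmoid(self.W["o"] @ z + self.b["o"])
            g = np.tanh(self.W["c"] @ z + self.b["c"])
            c = f * c + i * g                 # LSTM cell-state update
            h = o * np.tanh(c)
            p_keep = sigmoid(self.w_out @ h)  # decision emitted immediately
            keep.append(bool(p_keep >= threshold))
        return keep

model = IncrementalCompressor(emb_dim=8, hidden_dim=16)
tokens = np.random.default_rng(1).normal(size=(5, 8))  # 5 token embeddings
decisions = model.compress(tokens)
print(len(decisions))  # one keep/delete decision per token -> 5
```

With untrained weights the head outputs probabilities near 0.5, so the labels are arbitrary; the point of the sketch is only that each label is produced as soon as its token arrives, which is what makes the approach usable in a streaming setting.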

Keywords:
Sentence compression, Recurrent neural networks, Long short-term memory (LSTM), Autoencoders, Parsing, Natural language processing, Data compression, Speech recognition, Artificial neural networks, Computer science

Metrics

Cited By: 8
FWCI (Field-Weighted Citation Impact): 0.31
References: 42
Citation Normalized Percentile: 0.81
Topics

Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Speech Recognition and Synthesis (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

BOOK-CHAPTER

Deletion-Based Sentence Compression Using Bi-enc-dec LSTM

Dac-Viet Lai, Nguyễn Trường Sơn, Le-Minh Nguyen

Communications in Computer and Information Science, Year: 2018, Pages: 249-260
DISSERTATION

Incremental Prediction of Sentence-Final Verbs with Attentive Recurrent Neural Networks

Wenyan Li

University: University of Maryland (University Libraries), Year: 2018
JOURNAL ARTICLE

Handwriting generation using recurrent neural networks (LSTM)

Rabpreet Singh Keer

Journal: Zenodo (CERN European Organization for Nuclear Research), Year: 2023