JOURNAL ARTICLE

Exact training of a neural syntactic language model

Abstract

The structured language model (SLM) aims to predict the next word in a given word string by making a syntactic analysis of the preceding words. However, it faces the data sparseness problem because of the large dimensionality and diversity of the information available from the syntactic parse. Previously, we proposed using neural network models for the SLM (Emami, A. et al., Proc. ICASSP, 2003; Emami, Proc. EUROSPEECH'03, 2003). The neural network model is better suited to tackling the data sparseness problem, and its use gave significant improvements in perplexity and word error rate over the baseline SLM. We present a new method of training the neural net based SLM. This procedure makes use of the partial parses hypothesized by the SLM itself, and is more expensive than the approximate training method used previously. Experiments with the new training method on the UPenn and WSJ corpora show significant reductions in perplexity and word error rate, achieving the lowest published results for the given corpora.
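To make the conditioning concrete: unlike an n-gram model, the SLM predicts the next word from the exposed head words of the partial parse rather than from the literal preceding words, and the neural net maps those heads through shared embeddings to a distribution over the vocabulary. The sketch below is illustrative only, assuming a Bengio-style one-hidden-layer feedforward net; the vocabulary, layer sizes, and function names are invented for the example and are not the authors' implementation.

```python
# Illustrative sketch (not the paper's code): a feedforward neural LM that
# scores the next word given the two exposed head words of a partial parse.
import numpy as np

rng = np.random.default_rng(0)

VOCAB = ["<s>", "the", "contract", "ended", "with", "a", "loss"]
V = len(VOCAB)
EMB, HID = 8, 12  # toy embedding and hidden-layer sizes

# Untrained toy parameters; in practice these are learned by backprop.
E  = rng.normal(scale=0.1, size=(V, EMB))        # word embeddings
W1 = rng.normal(scale=0.1, size=(2 * EMB, HID))  # heads -> hidden
W2 = rng.normal(scale=0.1, size=(HID, V))        # hidden -> vocab logits

def next_word_probs(head1, head2):
    """P(next word | two exposed head words of the partial parse)."""
    x = np.concatenate([E[VOCAB.index(head1)], E[VOCAB.index(head2)]])
    h = np.tanh(x @ W1)               # hidden layer
    logits = h @ W2
    p = np.exp(logits - logits.max()) # numerically stable softmax
    return p / p.sum()

# E.g. after parsing "the contract ended", the exposed heads might be
# ("contract", "ended"); the net scores every candidate next word.
p = next_word_probs("contract", "ended")
```

Because the heads come from the parser's own hypotheses, the "exact" training discussed in the abstract must sum over the partial parses the SLM proposes instead of fixing a single parse in advance.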

Keywords:
Perplexity, Parsing, Artificial neural network, Word error rate, Language model, Natural language processing, Speech recognition, Computer science, Artificial intelligence, Linguistics

Metrics

Cited By: 22
FWCI (Field Weighted Citation Impact): 1.93
Refs: 8
Citation Normalized Percentile: 0.89

Topics

Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Speech Recognition and Synthesis (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

A Neural Syntactic Language Model

Ahmad Emami, Frederick Jelinek

Journal: Machine Learning, Year: 2005, Vol: 60 (1-3), Pages: 195-227
JOURNAL ARTICLE

Syntactic priming in the L2 neural language model

Sunjoo Choi, Myung-Kwan Park

Journal: The Journal of Linguistics Science, Year: 2022, Vol: 103, Pages: 81-104
JOURNAL ARTICLE

Gradual Syntactic Label Replacement for Language Model Pre-Training

Yile Wang, Yue Zhang, Peng Li, Yang Liu

Journal: IEEE/ACM Transactions on Audio Speech and Language Processing, Year: 2023, Vol: 32, Pages: 486-496
JOURNAL ARTICLE

Implicit and Explicit Second Language Training Recruit Common Neural Mechanisms for Syntactic Processing

Laura Batterink, Helen J. Neville

Journal: Journal of Cognitive Neuroscience, Year: 2013, Vol: 25 (6), Pages: 936-951