DISSERTATION

Incremental generative models for syntactic and semantic natural language processing

Jan Buys

Year: 2017 · University: University of Oxford · Archive: Oxford University Research Archive (ORA) · Publisher: University of Oxford

Abstract

This thesis investigates the role of linguistically motivated generative models of syntactic and semantic structure in natural language processing (NLP). Syntactic well-formedness is crucial in language generation, but most statistical models do not account for the hierarchical structure of sentences. Many applications requiring natural language understanding rely on structured semantic representations to enable querying, inference and reasoning, yet most semantic parsers produce domain-specific or inadequately expressive representations.

We propose a series of generative transition-based models for dependency syntax which can be applied as both parsers and language models, and which are amenable to supervised or unsupervised learning. Two models are based on Markov assumptions commonly made in NLP: the first is a Bayesian model with hierarchical smoothing; the second is parameterised by feed-forward neural networks. The Bayesian model enables careful analysis of the structure of the conditioning contexts required for generative parsers, but the neural network is more accurate. As a language model, the syntactic neural model outperforms both the Bayesian model and n-gram neural networks, pointing to the complementary nature of distributed and structured representations for syntactic prediction. We propose approximate inference methods based on particle filtering. The third model is parameterised by recurrent neural networks (RNNs), dropping the Markov assumptions; exact inference with dynamic programming is made tractable here by simplifying the structure of the conditioning contexts.

We then shift the focus to semantics and propose models for parsing sentences to labelled semantic graphs. We introduce a transition-based parser which incrementally predicts graph nodes (predicates) and edges (arguments); this approach is contrasted with predicting top-down graph traversals. RNNs and pointer networks are key components in approaching graph parsing as an incremental prediction problem. The RNN architecture is augmented to condition the model explicitly on the transition-system configuration. We develop a robust parser for Minimal Recursion Semantics, a linguistically expressive framework for compositional semantics which had previously been parsed only with grammar-based approaches. Our parser is much faster than the grammar-based model, and the same approach improves the accuracy of neural Abstract Meaning Representation parsing.
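To make the transition-based idea in the abstract concrete, here is a minimal sketch of an arc-standard shift-reduce system for dependency parsing. The `parse` function and transition names are illustrative assumptions for exposition only, not code or notation from the dissertation; the thesis's generative models additionally assign probabilities to these actions.

```python
# Illustrative sketch (not from the dissertation): an arc-standard
# transition system that applies a given action sequence to a sentence
# and returns the resulting (head, dependent) arcs over word indices.
def parse(words, transitions):
    stack = []                        # partially processed word indices
    buffer = list(range(len(words)))  # unread word indices, left to right
    arcs = []                         # (head, dependent) pairs
    for t in transitions:
        if t == "SHIFT":              # move the next word onto the stack
            stack.append(buffer.pop(0))
        elif t == "LEFT-ARC":         # second-top is a dependent of top
            dep = stack.pop(-2)
            arcs.append((stack[-1], dep))
        elif t == "RIGHT-ARC":        # top is a dependent of second-top
            dep = stack.pop()
            arcs.append((stack[-1], dep))
    return arcs

# "She ate fish": "ate" heads both "She" and "fish".
print(parse(["She", "ate", "fish"],
            ["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "RIGHT-ARC"]))
# → [(1, 0), (1, 2)]
```

A generative variant of this system, as studied in the thesis, would predict each word at its SHIFT action, so that one action sequence jointly scores the sentence and its parse.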

Keywords:
Computer science, Artificial intelligence, Parsing, Natural language processing, Generative model, Inference, Syntax, Language model, Generative grammar, Dependency grammar, Recurrent neural network, Artificial neural network

Metrics

Cited By: 1
FWCI (Field Weighted Citation Impact): 0.00
Refs: 157

Topics

Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Speech and dialogue systems (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

Processing natural language syntactic and semantic mechanisms

W. D. Hagamen, Peter Berry, Kenneth E. Iverson, John C. Weber

Journal: ACM SIGAPL APL Quote Quad · Year: 1989 · Vol: 19 (4) · Pages: 184-189
DISSERTATION

Deep generative models for natural language processing

Yishu Miao

University: University of Oxford (Oxford University Research Archive, ORA) · Year: 2017
JOURNAL ARTICLE

Syntactic parsing with generative language models

Marlene Albrecht

Journal: University of Vienna · Year: 2025
JOURNAL ARTICLE

Stellenwert von Natural Language Processing und chatbasierten Generative Language Models [The significance of natural language processing and chat-based generative language models]

Markus Haar, Michael Sonntagbauer, Stefan Kluge

Journal: Medizinische Klinik - Intensivmedizin und Notfallmedizin · Year: 2023 · Vol: 119 (3) · Pages: 181-188