JOURNAL ARTICLE

Semi-Supervised Semantic Dependency Parsing Using CRF Autoencoders

Abstract

Semantic dependency parsing, which aims to find rich bi-lexical relationships, allows words to have multiple dependency heads, resulting in graph-structured representations. We propose an approach to semi-supervised learning of semantic dependency parsers based on the CRF autoencoder framework. Our encoder is a discriminative neural semantic dependency parser that predicts the latent parse graph of the input sentence. Our decoder is a generative neural model that reconstructs the input sentence conditioned on the latent parse graph. Our model is arc-factored and therefore parsing and learning are both tractable. Experiments show our model achieves significant and consistent improvement over the supervised baseline.
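To make the arc-factored structure concrete, here is a minimal numpy sketch of the objective shape the abstract describes: an encoder that scores each head→dependent arc independently, and a decoder that reconstructs each word conditioned on a candidate head. All names (`W_enc`, `W_dec`, the bilinear scorer, the per-head softmax decoder) are illustrative assumptions, not the paper's actual parameterization; the point is only that with independent per-arc terms the objective decomposes into a tractable sum.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy sentence: n words, each with a d-dim embedding, drawn from a small vocab.
n, d, vocab = 5, 8, 20
emb = rng.normal(size=(n, d))
word_ids = rng.integers(0, vocab, size=n)

# Encoder (discriminative, arc-factored): score[i, j] scores word j
# taking word i as a head; arcs are independent Bernoulli variables.
W_enc = rng.normal(size=(d, d)) * 0.1
arc_scores = emb @ W_enc @ emb.T        # (n, n) bilinear arc scores
arc_probs = sigmoid(arc_scores)         # independent per-arc probabilities

# Decoder (generative): a head's embedding predicts its dependent word
# via a softmax over the vocabulary (a deliberate simplification).
W_dec = rng.normal(size=(d, vocab)) * 0.1
logits = emb @ W_dec                    # (n, vocab): row i = head i's predictions
logp = logits - np.log(np.exp(logits).sum(-1, keepdims=True))  # log-softmax

# Arc-factored autoencoder objective: each candidate arc (i -> j) contributes
# p(arc) * log p(word_j | head_i) independently, so the sum is tractable.
obj = 0.0
for i in range(n):
    for j in range(n):
        if i != j:
            obj += arc_probs[i, j] * logp[i, word_ids[j]]
```

Because every term involves a single arc, both computing this objective and finding the best parse graph reduce to per-arc decisions, which is what makes learning and parsing tractable in the arc-factored setting.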

Keywords:
Computer science, Dependency grammar, Artificial intelligence, Parsing, Natural language processing, Autoencoder, Graph, Discriminative model, Generative model, Encoder, Semantic role labeling, Deep learning, Theoretical computer science

Metrics

Cited By: 10
FWCI (Field Weighted Citation Impact): 1.32
Refs: 35
Citation Normalized Percentile: 0.83

Topics

Natural Language Processing Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Machine Learning in Bioinformatics (Life Sciences → Biochemistry, Genetics and Molecular Biology → Molecular Biology)