BOOK-CHAPTER

Dependency Parsing

Ruket Çakıcı

Year: 2011 · In: Machine Learning · Pages: 2117-2124 · Publisher: Springer Science+Business Media

Abstract

Annotated data have recently become more important, and thus more abundant, in computational linguistics. They are used as training material for machine learning systems in a wide variety of applications, from parsing to machine translation (Quirk et al., 2005). The dependency representation is preferred for many languages because linguistic and semantic information is easier to retrieve from this more direct representation. Dependencies are relations defined on words or smaller units, in which a sentence is divided into elements called heads and their arguments, e.g. verbs and their objects. Dependency parsing aims to predict these dependency relations between lexical units in order to retrieve information, mostly in the form of semantic interpretation or syntactic structure. Parsing is usually considered the first step of Natural Language Processing (NLP). Training statistical parsers requires a sample of data annotated with the necessary information. There are different views on how informative or functional a representation of natural language sentences should be, and different constraints on the design process, such as: 1) how intuitive (natural) it is, 2) how easy it is to extract information from it, and 3) how appropriately and unambiguously it represents the phenomena that occur in natural languages. This article reviews statistical dependency parsing for different languages and discusses current challenges in designing dependency treebanks and in dependency parsing.
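To make the head-argument relations described in the abstract concrete, here is a minimal sketch (not from the chapter itself) of how a dependency analysis is commonly encoded: each word stores the index of its head, with an artificial root marked as 0. The sentence, the head indices, and the well-formedness check are illustrative assumptions, not the author's notation.

```python
# Sketch of a dependency representation: heads[i] is the 1-based index of
# the head of word i+1; 0 marks attachment to the artificial root.
sentence = ["Economic", "news", "had", "little", "effect", "on", "markets"]
heads = [2, 3, 0, 5, 3, 5, 6]  # e.g. "news" -> "had", "had" -> ROOT

def is_well_formed(heads):
    """Check the single-root and acyclicity constraints on a dependency tree."""
    if heads.count(0) != 1:
        return False  # exactly one word may attach to the artificial root
    n = len(heads)
    for i in range(1, n + 1):
        seen, j = set(), i
        while j != 0:            # follow head links upward toward the root
            if j in seen:
                return False     # revisiting a node means a cycle, not a tree
            seen.add(j)
            j = heads[j - 1]
    return True

print(is_well_formed(heads))  # True for this analysis
```

A dependency parser's output can be seen as exactly such a head vector (usually with a relation label per arc); the tree constraints above are what most parsing algorithms enforce during search.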

Keywords:
Computer science, Natural language processing, Parsing, Artificial intelligence, Dependency grammar, Machine translation, S-attributed grammar, Representation, Bottom-up parsing, Semantic interpretation, Natural language, Syntactic predicate, Top-down parsing

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
Refs: 35
Citation Normalized Percentile: 0.33

Topics

Natural Language Processing Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Topic Modeling
Physical Sciences →  Computer Science →  Artificial Intelligence
Semantic Web and Ontologies
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

JOURNAL ARTICLE

Dependency Parsing

Sandra Kübler, Ryan McDonald, Joakim Nivre

Journal: Synthesis lectures on human language technologies · Year: 2009 · Vol: 2 (1) · Pages: 1-127
BOOK-CHAPTER

Dependency Parsing

Pierre Nugues

Cognitive technologies Year: 2014 Pages: 403-437
BOOK

Dependency Parsing

Sandra Kübler, Ryan McDonald, Joakim Nivre

Synthesis lectures on human language technologies Year: 2009
BOOK-CHAPTER

Dependency Parsing

Ruket Çakıcı

IGI Global eBooks Year: 2009 Pages: 449-455
JOURNAL ARTICLE

Dependency Parsing

Joakim Nivre

Journal: Language and Linguistics Compass · Year: 2010 · Vol: 4 (3) · Pages: 138-152