2010
DOI: 10.1007/978-90-481-9352-3_3

A Latent Variable Model for Generative Dependency Parsing

Abstract: We propose a generative dependency parsing model which uses binary latent variables to induce conditioning features. To define this model we use a recently proposed class of Bayesian Networks for structured prediction, Incremental Sigmoid Belief Networks. We demonstrate that the proposed model achieves state-of-the-art results on three different languages. We also demonstrate that the features induced by the ISBN's latent variables are crucial to this success, and show that the proposed model is particularly g…
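
Although the abstract only names the model class, the core idea of binary latent variables inducing conditioning features can be illustrated with a rough sketch. The layer below is a loose, mean-field-style approximation of one incremental sigmoid-belief-network step; all class, parameter, and variable names (LatentStep, W_prev, W_out) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LatentStep:
    """One incremental step with binary latent variables (mean-field sketch).

    The latent vector is a set of independent Bernoulli variables conditioned
    on earlier latent vectors; the next parser decision is predicted from the
    latent means. Sizes, names, and the mean-field shortcut are assumptions.
    """
    def __init__(self, n_latent, n_decisions):
        rng = np.random.default_rng(0)
        self.W_prev = rng.normal(0.0, 0.1, (n_latent, n_latent))    # links from earlier latent states
        self.b = np.zeros(n_latent)                                 # latent biases
        self.W_out = rng.normal(0.0, 0.1, (n_decisions, n_latent))  # decision layer

    def step(self, prev_states):
        # Means of the binary latent variables, conditioned on previous latent states.
        latent_mean = sigmoid(self.b + sum(self.W_prev @ s for s in prev_states))
        # Distribution over the next parser decision (e.g. shift / left-arc / right-arc).
        scores = self.W_out @ latent_mean
        probs = np.exp(scores - scores.max())
        return latent_mean, probs / probs.sum()

# Tiny usage example: one step conditioned on an all-zero initial latent state.
step = LatentStep(n_latent=16, n_decisions=3)
h1, action_probs = step.step([np.zeros(16)])
```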

Cited by 37 publications (54 citation statements)
References 24 publications
“…From the perspective of applying deep networks in natural language processing systems, there are a number of works in the literature (Collobert and Weston, 2008; Collobert et al., 2011; Henderson, 2004; Socher et al., 2011; Titov and Henderson, 2010; Turian et al., 2010). Socher et al. (2011) applied recursive autoencoders to address sentence-level sentiment classification problems.…”
Section: Related Work (mentioning; confidence: 99%)
“…Henderson (2004) proposed discriminative training methods for learning a neural network statistical parser. Titov and Henderson (2010) extended Incremental Sigmoid Belief Networks (Titov and Henderson, 2007) to a generative latent variable model for dependency parsing. Turian et al. (2010) employed neural networks to induce word representations for sequence labeling tasks such as named entity recognition.…”
Section: Related Work (mentioning; confidence: 99%)
“…Like the previous version of this parser (Titov and Henderson, 2007b), it uses a recurrent neural network (RNN) to predict the actions for a fast shift-reduce dependency parser. Decoding is done with a beam search where pruning occurs after each shift action.…”
Section: Parser (mentioning; confidence: 99%)
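
The decoding scheme described in the quotation above (shift-reduce transitions explored with a beam search and pruned after shifts) can be sketched roughly as follows. The arc-standard transition system, the toy scorer, and all names here are illustrative assumptions; the sketch also simplifies pruning to after every action rather than only after shift actions, so it is not the parser's actual implementation.

```python
import heapq
from dataclasses import dataclass

@dataclass(frozen=True)
class State:
    stack: tuple    # word indices on the stack (0 is an artificial root)
    buffer: tuple   # word indices still to be read
    arcs: tuple     # (head, dependent) pairs built so far
    logp: float = 0.0

def legal_actions(state):
    if state.buffer:
        yield "shift"
    if len(state.stack) >= 2:
        yield "left-arc"
        yield "right-arc"

def apply_action(state, action, logp):
    s, b, a = state.stack, state.buffer, state.arcs
    if action == "shift":
        return State(s + (b[0],), b[1:], a, state.logp + logp)
    if action == "left-arc":   # second-from-top takes the top as its head
        return State(s[:-2] + (s[-1],), b, a + ((s[-1], s[-2]),), state.logp + logp)
    return State(s[:-1], b, a + ((s[-2], s[-1]),), state.logp + logp)  # right-arc

def beam_parse(n_words, score, beam_size=8):
    """score(state, action) -> log-probability of taking `action` in `state`."""
    beam = [State((0,), tuple(range(1, n_words + 1)), ())]
    while any(s.buffer or len(s.stack) > 1 for s in beam):
        frontier = []
        for st in beam:
            if not st.buffer and len(st.stack) == 1:
                frontier.append(st)   # already a complete derivation
                continue
            for act in legal_actions(st):
                frontier.append(apply_action(st, act, score(st, act)))
        # Pruning: keep only the `beam_size` highest-scoring derivations.
        beam = heapq.nlargest(beam_size, frontier, key=lambda s: s.logp)
    return max(beam, key=lambda s: s.logp)

# Toy scorer that mildly prefers shifting, just to exercise the decoder.
best = beam_parse(4, lambda st, act: 0.0 if act == "shift" else -0.1)
print(best.arcs)
```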
“…The system described in this paper is the grandchild of the first transition-based neural network dependency parser (Titov and Henderson, 2007b), which was the University of Geneva's entry in the CoNLL 2007 multilingual dependency parsing shared task (Titov and Henderson, 2007a). The system has undergone some developments and modifications, in particular the faster discriminative version introduced by Yazdani and Henderson (2015), but in many respects the design and implementation of this parser is unchanged since 2007.…”
Section: Introduction (mentioning; confidence: 99%)
“…The method was pioneered by Kudo and Matsumoto (2002) and Yamada and Matsumoto (2003) and has since been developed by a large number of researchers (Nivre et al., 2004; Attardi, 2006; Sagae and Tsujii, 2008; Titov and Henderson, 2007; Zhang and Clark, 2008). Similar techniques had previously been explored in other parsing frameworks by Briscoe and Carroll (1993) and Ratnaparkhi (1997), among others.…”
Section: Introduction (mentioning; confidence: 99%)