Proceedings of the 2009 Conference on Empirical Methods in Natural Language Processing, Volume 2 (EMNLP '09), 2009
DOI: 10.3115/1699571.1699585
An empirical study of semi-supervised structured conditional models for dependency parsing

Abstract: This paper describes an empirical study of high-performance dependency parsers based on a semi-supervised learning approach. We describe an extension of semi-supervised structured conditional models (SS-SCMs) to the dependency parsing problem, whose framework was originally proposed in (Suzuki and Isozaki, 2008). Moreover, we introduce two extensions related to dependency parsing: the first extension is to combine SS-SCMs with another semi-supervised approach, described in (Koo et al., 2008). The second extensio…

Cited by 37 publications (31 citation statements); References 13 publications
“…Recently, semi-supervised classifiers have been proposed based on hybrid generative/discriminative models [18,14,28,29,1]. As reported in [19], generative models unsuitable for real data samples provide worse classification performance than discriminative models.…”
Section: Introduction (mentioning)
confidence: 99%
“…In our formulation, classifiers are constructed by combining discriminative and generative models, which is similar to a previous semi-supervised classification framework, namely the Joint probability model Embedding style Semi-Supervised Conditional Model (JESS-CM), which substantially improved the performance of natural language processing (NLP) tasks [28,29]. The proposed method provides a classifier training algorithm improved to deal with the difference between source and target distributions.…”
Section: Introduction (mentioning)
confidence: 99%
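The combination described in the citation above can be illustrated with a minimal sketch. This is not the JESS-CM implementation: the toy Naive Bayes generative component, the feature names, and the counts below are all invented for illustration. The idea it shows is the one the quote names: a discriminative linear scorer whose feature set includes the log joint probability assigned by a generative model, so the generative component (which can be estimated from unlabeled data) is embedded as a feature of the conditional model.

```python
import math

def nb_log_joint(counts, total, word, label, vocab_size=10):
    """Toy generative component: add-one smoothed log P(word, label).

    `counts[label]["_n"]` holds the label count; other keys hold
    per-word counts for that label. All names here are illustrative.
    """
    p_label = (counts.get(label, {}).get("_n", 0) + 1) / (total + 2)
    n_label = counts.get(label, {}).get("_n", 0)
    p_word = (counts.get(label, {}).get(word, 0) + 1) / (n_label + vocab_size)
    return math.log(p_label) + math.log(p_word)

def score(weights, counts, total, word, label):
    """Discriminative score: indicator feature + weighted generative feature."""
    s = weights.get(("word", word, label), 0.0)
    s += weights.get(("gen", label), 0.0) * nb_log_joint(counts, total, word, label)
    return s

def predict(weights, counts, total, word, labels):
    """Pick the label with the highest combined score."""
    return max(labels, key=lambda y: score(weights, counts, total, word, y))
```

In a full system the discriminative weights (including the weight on the generative feature) would be fit on labeled data, while the generative counts could additionally be re-estimated from unlabeled data; the sketch only shows how the two scores combine.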
“…However, their parser is a joint model of parsing and POS-tagging, and they use external data in parsing. We list the results of , Koo et al. (2008) and Suzuki et al. (2009) in Table 7, which make use of large-scale unannotated text to improve parsing accuracies. The input embeddings of our parser are also trained over large raw text, and in this perspective our model is correlated with the semi-supervised models.…”
Section: Chunking (mentioning)
confidence: 99%
“…Koo et al (2008) and Suzuki et al (2009) use unsupervised wordclusters as features in a dependency parser to get lexical dependencies. This has some notional similarity to categories, since, like categories, clusters are less fine-grained than words but more finegrained than POS-tags.…”
Section: Related Workmentioning
confidence: 99%
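The cluster features mentioned above can be sketched as follows. Hierarchical (e.g. Brown) clusterings assign each word a bit string, and taking prefixes of that string yields representations coarser than the word but finer than a POS tag, which is the granularity point the quote makes. The cluster strings and feature templates below are invented for illustration, not taken from Koo et al. (2008).

```python
# Invented example cluster bit strings; a real system would load these
# from a clustering induced on large unannotated text.
CLUSTERS = {
    "bank": "110100",
    "river": "110101",
    "lend": "011010",
}

def cluster_features(head, modifier, prefix_lengths=(4, 6)):
    """Emit cluster-prefix pair features for a head-modifier dependency arc.

    Each prefix length gives one feature at a different granularity:
    short prefixes are coarse (POS-like), long prefixes are fine (word-like).
    """
    feats = [("word-pair", head, modifier)]
    hc, mc = CLUSTERS.get(head), CLUSTERS.get(modifier)
    if hc and mc:
        for k in prefix_lengths:
            feats.append(("cluster-pair", k, hc[:k], mc[:k]))
    return feats
```

For the arc bank → river the two words share the 4-bit prefix `1101`, so the coarse feature fires on both even though the full 6-bit clusters differ, which is how such features generalize across lexical items.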