Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics and the 7th International Joint Conference on Natural Language Processing (ACL-IJCNLP), 2015.
DOI: 10.3115/v1/P15-2053
Chinese Zero Pronoun Resolution: A Joint Unsupervised Discourse-Aware Model Rivaling State-of-the-Art Resolvers

Abstract: We propose an unsupervised probabilistic model for zero pronoun resolution. To our knowledge, this is the first such model that (1) is trained on zero pronouns in an unsupervised manner; (2) jointly identifies and resolves anaphoric zero pronouns; and (3) exploits discourse information provided by a salience model. Experiments demonstrate that our unsupervised model significantly outperforms its state-of-the-art unsupervised counterpart when resolving the Chinese zero pronouns in the OntoNotes corpus.


Cited by 26 publications (14 citation statements)
References 13 publications
“…The integer linear programming approach was then employed to enhance the performance of the ranking model. They [9] further proposed an end-to-end unsupervised probabilistic model for the Chinese zero pronoun resolution task. They used a salience model to capture discourse information.…”
Section: Zero Pronoun Resolution
Citation type: mentioning
Confidence: 99%
“…Five recent zero pronoun resolution systems are employed as our baselines, namely, Zhao and Ng (2007), Chen and Ng (2015), Chen and Ng (2016), Yin et al (2017a) and Yin et al (2017b). The first of them is machine learning-based, the second is unsupervised, and the others are all deep learning models.…”
Section: Baselines and Experiment Settings
Citation type: mentioning
Confidence: 99%
“…They first recover each ZP into ten overt pronouns and then apply a ranking model to rank the antecedents. Chen and Ng (2015) propose an end-to-end unsupervised probabilistic model, utilizing a salience model to capture discourse information. In recent years, Chen and Ng (2016) develop a deep neural network approach to learn useful task-specific representations and effectively exploit lexical features through word embeddings.…”
Section: Zero Pronoun Resolution
Citation type: mentioning
Confidence: 99%