2022
DOI: 10.3390/fi14030085

Addressing Syntax-Based Semantic Complementation: Incorporating Entity and Soft Dependency Constraints into Metonymy Resolution

Abstract: State-of-the-art methods for metonymy resolution (MR) consider the sentential context by modeling the entire sentence. However, informative cues such as entity representations or syntactic structure may also be beneficial for identifying metonymy, and approaches that rely only on deep neural networks fail to capture such information. To leverage both entity and syntax constraints, this paper proposes a robust model, EBAGCN, for metonymy resolution. First, this work extracts syntactic dependency relations under the guidance of …
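The abstract describes extracting syntactic dependency relations and integrating entity and syntax constraints in a neural model. As a rough illustration only (an assumption, not the authors' EBAGCN architecture), the sketch below uses spaCy to build a dependency-tree adjacency matrix and applies a single GCN-style propagation step over stand-in token features; the example sentence, the feature dimensions, and the use of the "en_core_web_sm" parser are all hypothetical choices.

```python
# Rough illustration (assumption, not the paper's EBAGCN): derive a dependency
# adjacency matrix with spaCy and run one GCN-style propagation step over it.
import numpy as np
import spacy  # assumes the small English model "en_core_web_sm" is installed

nlp = spacy.load("en_core_web_sm")

def dependency_adjacency(doc):
    """Symmetric adjacency over the dependency tree, with self-loops."""
    n = len(doc)
    adj = np.eye(n)
    for tok in doc:
        if tok.i != tok.head.i:  # the root points to itself; skip that arc
            adj[tok.i, tok.head.i] = 1.0
            adj[tok.head.i, tok.i] = 1.0
    return adj

def gcn_step(features, adj, weight):
    """One graph convolution: row-normalised neighbour averaging + ReLU."""
    norm = adj / adj.sum(axis=1, keepdims=True)
    return np.maximum(norm @ features @ weight, 0.0)

doc = nlp("Germany refused to sign the treaty.")  # 'Germany' used metonymically
adj = dependency_adjacency(doc)

rng = np.random.default_rng(0)
feats = rng.normal(size=(len(doc), 32))   # stand-in token embeddings
weight = rng.normal(size=(32, 32)) * 0.1  # untrained projection, illustration only
hidden = gcn_step(feats, adj, weight)
print(hidden.shape)  # (number_of_tokens, 32)
```

In such a setup the dependency adjacency restricts information flow to syntactically related tokens, which is one plausible way to encode the "soft dependency constraints" the title refers to.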

Cited by 2 publications (1 citation statement)
References 38 publications
“…The results of their experiments achieve the state of the art with 89.1%, 94.8% and 95.9% on SemEval, ReLocaR and WIMCOR respectively. Du and Wang [41] state that state-of-the-art methods focus only on the sentential context, neglecting entity representation and syntactic structure; their study therefore exploits entity and syntax constraints by extracting syntactic dependency relations and developing a neural network that integrates both constraints into a richer representation, enabling the model to handle complex sentences while reducing noise. The experiments on the SemEval and ReLocaR datasets show a considerable improvement, scoring 89.8% and 95.7% respectively, over 4% higher than the BERT model [22].…”
Section: Research After the SemEval
confidence: 99%