Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), 2021
DOI: 10.18653/v1/2021.acl-long.215
Interpretable and Low-Resource Entity Matching via Decoupling Feature Learning from Decision Making

Abstract: Entity Matching (EM) aims at recognizing entity records that denote the same real-world object. Neural EM models learn vector representations of entity descriptions and match entities end-to-end. Though robust, these methods require many annotated resources for training and lack interpretability. In this paper, we propose a novel EM framework that consists of Heterogeneous Information Fusion (HIF) and Key Attribute Tree (KAT) Induction to decouple feature representation from matching decision. Using self-su…
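The decoupling described in the abstract can be pictured as two separable stages: a feature module that scores attribute-level similarity, and an interpretable decision rule over key attributes. Below is a minimal sketch of that separation; the helper names (`hif_similarity`, `KeyAttributeTree`), the string-ratio features, and the example records are illustrative assumptions, not the authors' HIF/KAT implementation.

```python
# Sketch of decoupling feature learning from decision making:
# a feature module scores each attribute pair, and a separate,
# inspectable rule over key attributes makes the match decision.
from dataclasses import dataclass
from typing import Dict
from difflib import SequenceMatcher


def hif_similarity(value_a: str, value_b: str) -> float:
    """Stand-in for the learned HIF features: here a simple string ratio."""
    return SequenceMatcher(None, value_a, value_b).ratio()


@dataclass
class KeyAttributeTree:
    """Stand-in for KAT Induction: a readable rule over key attributes."""
    key_attribute: str
    threshold: float

    def decide(self, features: Dict[str, float]) -> bool:
        # The decision reads only per-attribute similarity features, so it can
        # be inspected independently of how those features were produced.
        return features[self.key_attribute] >= self.threshold


record_a = {"title": "iPhone 12 Pro 128GB", "brand": "Apple"}
record_b = {"title": "Apple iPhone 12 Pro (128 GB)", "brand": "Apple"}

features = {attr: hif_similarity(record_a[attr], record_b[attr]) for attr in record_a}
kat = KeyAttributeTree(key_attribute="title", threshold=0.6)
print(features, "->", "match" if kat.decide(features) else "non-match")
```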

Cited by 9 publications
(6 citation statements)
References 30 publications
“…Mojito [22] Model-specific LightEA [59], HIF+KAT [130], GMKSLEM [21] XTransE [137], CPM [99], Kelpie [81], CrossE [138], KGInfluence [143], GINN [40], SNS [39], approxSemanticCrossE [18] Local self-explaining TMN [52], BTPK [15], AutoTriggER [49], Instance-based [68] [91], D-REX [2], SIRE [134], SAIS [125], NERO [141], DISCO-RE [111],…”
Section: Entity Extraction, Relation Extraction, Entity Resolution, Link… (mentioning)
confidence: 99%
“…Early works focus on engineering matching rules along with sophisticated comparison features. Recent studies suggest that attention architectures [124,125] and pre-training schemes [126,127] may play an important role in entity matching.…”
Section: Knowledge Graph Completion and Integration (mentioning)
confidence: 99%
“…(2) The semantic representations consider the embedding vectors that encapsulate a textual value, leveraging word-, character-, or transformer-based models. In this work, we exclusively consider the unsupervised, pre-trained embeddings of fastText [8] that have been experimentally verified to effectively address the out-of-vocabulary cases in ER tasks, which abound in domain-specific terminology [25,46,59,64,66].…”
Section: Qualitative Analysis (mentioning)
confidence: 99%
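The statement above describes representing attribute values with pre-trained fastText embeddings so that out-of-vocabulary, domain-specific tokens still get vectors via character n-grams. Here is a minimal sketch of that idea, assuming the official `fasttext` Python package and a downloaded `cc.en.300.bin` model; both the model file and the example strings are assumptions for illustration, not the cited system's implementation.

```python
# Embed textual attribute values with pre-trained fastText vectors and
# compare them by cosine similarity; subword information covers OOV tokens.
import numpy as np
import fasttext  # pip install fasttext

model = fasttext.load_model("cc.en.300.bin")  # assumed pre-trained model with subword info


def embed(value: str) -> np.ndarray:
    # get_sentence_vector averages normalized word vectors; unseen words
    # still receive vectors built from their character n-grams.
    return model.get_sentence_vector(value.lower())


def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-9))


sim = cosine(embed("Canon EOS 5D Mark IV DSLR"), embed("EOS-5D MK4 digital camera"))
print(f"semantic similarity: {sim:.3f}")  # compared against a tuned threshold in practice
```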