Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 2019
DOI: 10.18653/v1/p19-1586
Low-resource Deep Entity Resolution with Transfer and Active Learning

Abstract: Entity resolution (ER) is the task of identifying different representations of the same real-world entities across databases. It is a key step for knowledge base creation and text mining. Recent adaptation of deep learning methods for ER mitigates the need for dataset-specific feature engineering by constructing distributed representations of entity records. While these methods achieve state-of-the-art performance over benchmark data, they require large amounts of labeled data, which are typically unavailable i…
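The abstract frames ER as deciding whether two records refer to the same real-world entity, with active learning used to spend a small labeling budget where it helps most. A minimal illustrative sketch of this framing, using a simple character-similarity score and uncertainty sampling (all names and the similarity feature are assumptions for illustration; the paper itself uses learned distributed representations of records, not string similarity):

```python
# Minimal sketch: ER as pair classification plus uncertainty-based
# active learning. The similarity feature and threshold are
# illustrative assumptions, not the paper's actual model.
from difflib import SequenceMatcher


def similarity(a: str, b: str) -> float:
    """Character-level similarity in [0, 1] between two records."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()


def match(a: str, b: str, threshold: float = 0.6) -> bool:
    """Declare a match when similarity clears the threshold."""
    return similarity(a, b) >= threshold


def most_uncertain(pairs, threshold: float = 0.6):
    """Active-learning step: pick the pair whose score lies closest
    to the decision boundary, i.e. the pair whose label would be
    most informative to request from an annotator."""
    return min(pairs, key=lambda p: abs(similarity(*p) - threshold))


pairs = [
    ("Apple Inc.", "Apple Incorporated"),
    ("IBM", "Microsoft"),
    ("J. Smith", "John Smith"),
]

print(match("Apple Inc.", "Apple Incorporated"))  # True
print(match("IBM", "Microsoft"))                  # False
print(most_uncertain(pairs))
```

The `most_uncertain` selector is the core active-learning idea: rather than labeling random pairs, query the ones the current scorer is least sure about, which is how such methods cut labeling cost.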

Cited by 108 publications (99 citation statements).
References 34 publications (40 reference statements).
“…In addition, we show the best reported result found in related work from matching systems that use supervised learning; matching systems that use other types of learning (e.g. active learning or semi-supervised learning) are excluded from this comparison [13,18,22]. Comparisons to results reported in related work should be made with attention to the differing, unfixed train, optimization, and test sets.…”
Section: Baseline Results (mentioning)
confidence: 91%
“…Recent literature on Machine Learning applied to RL includes Aiken et al (2019), who compare probabilistic, stochastic, and machine learning approaches, showing that supervised methods outperform unsupervised ones; Dong & Rekatsinas (2018), who survey state-of-the-art data integration solutions based on Machine Learning and discuss open research challenges; Kasai et al (2019), who leverage Deep Learning in a combination of Transfer and Active Learning, aiming to reduce the labeled data required by up to an order of magnitude; Di Cicco et al (2019), who present an attempt at explainable Deep Learning using LIME, a popular tool for explaining classification predictions; and Hou et al (2019), who propose a paradigm called ‘‘gradual machine learning’’ in which data are labeled automatically through iterative factor graph inference, starting with the easiest instances and proceeding to the hardest ones.…”
Section: Related Work (mentioning)
confidence: 99%
“…Recently, deep learning in particular has received increasing attention [3,8,11]. The community also sees the need for interactively querying the user for examples, and exciting work has been done within active learning [5,7].…”
Section: Background and State of the Art (mentioning)
confidence: 99%