2023
DOI: 10.3390/electronics12122692

Entity Linking Method for Chinese Short Texts with Multiple Embedded Representations

Abstract: Entity linking, a crucial task in the realm of natural language processing, aims to link entity mentions in a text to their corresponding entities in the knowledge base. While long documents provide abundant contextual information, facilitating feature extraction for entity identification and disambiguation, entity linking in Chinese short texts presents significant challenges. This study introduces an innovative approach to entity linking within Chinese short texts, combining multiple embedding representation…
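The truncated abstract only outlines the approach at a high level. As a rough, hypothetical illustration of what combining multiple embedding representations for entity linking can look like, the sketch below concatenates two toy embedding views of a mention's context and ranks knowledge-base candidates by cosine similarity. The function names, toy encoders, dimensions, and candidate dictionary are assumptions made purely for illustration; they are not the paper's actual model or data.

```python
# Minimal sketch (not the authors' implementation): fuse several embedding
# "views" of a mention's context and rank knowledge-base candidates by
# cosine similarity. The encoders below are toy placeholders standing in
# for e.g. character-level and word-level (or contextual) encoders.
import numpy as np


def char_embedding(text: str, dim: int = 16) -> np.ndarray:
    """Toy character-level view: average of per-character pseudo-embeddings."""
    vecs = [np.random.default_rng(ord(c)).standard_normal(dim) for c in text]
    return np.mean(vecs, axis=0)


def word_embedding(text: str, dim: int = 16) -> np.ndarray:
    """Toy word-level view: average of per-token pseudo-embeddings."""
    tokens = text.split() or [text]
    vecs = [np.random.default_rng(sum(ord(c) for c in t)).standard_normal(dim)
            for t in tokens]
    return np.mean(vecs, axis=0)


def combined_embedding(text: str) -> np.ndarray:
    """Concatenate the multiple embedding views into one representation."""
    return np.concatenate([char_embedding(text), word_embedding(text)])


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))


def link_entity(mention_context: str, candidates: dict) -> str:
    """Score each candidate's KB description against the mention context
    and return the best-scoring entity id."""
    query = combined_embedding(mention_context)
    scores = {eid: cosine(query, combined_embedding(desc))
              for eid, desc in candidates.items()}
    return max(scores, key=scores.get)


# Hypothetical usage: two made-up KB entries for the mention "apple".
# With toy random encoders the ranking is illustrative only.
candidates = {
    "Q312": "Apple Inc. American technology company",
    "Q89": "apple fruit of the apple tree",
}
print(link_entity("I bought a new apple laptop", candidates))
```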

Cited by 2 publications (2 citation statements)
References 31 publications
“…Zhang et al [10] utilized features similar to those of twin networks to deeply analyze semantic relationships in texts and fully utilized the feature information of the texts to be disambiguated. Shi et al [11] combined multiple embedding representations for entity linking in Chinese short texts to improve the performance of entity linking.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
“…Consequently, these methods struggle to effectively capture text features in short-text scenarios, resulting in a substantial decline in disambiguation performance. Additionally, current entity disambiguation approaches, especially for short-text disambiguation [10,11], heavily rely on a large volume of training samples to ensure model generalization and disambiguation effectiveness. However, acquiring sufficient data and conducting model training for various tasks can be costly in real-world applications.…”
Section: Introduction (citation type: mentioning)
confidence: 99%