Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1488
Heterogeneous Graph Attention Networks for Semi-supervised Short Text Classification

Abstract: Short text classification has found rich and critical applications in news and tweet tagging to help users find relevant information. Due to lack of labeled training data in many practical use cases, there is a pressing need for studying semi-supervised short text classification. Most existing studies focus on long texts and achieve unsatisfactory performance on short texts due to the sparsity and limited labeled data. In this paper, we propose a novel heterogeneous graph neural network based method for semi-s…
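The graph behind this method is described by citing work below as a topic-entity heterogeneous graph over short texts. A minimal sketch of such a construction, assuming networkx is available; the node names and edges are purely illustrative, and the paper's actual pipeline (how topics and entities are obtained and linked) is not reproduced here:

```python
# Illustrative topic-entity heterogeneous graph for short texts; node names
# and edges are hypothetical, not taken from the paper's pipeline.
import networkx as nx

G = nx.Graph()

# Three node types: short documents, latent topics, and linked entities.
G.add_node("doc_0", ntype="document", text="apple unveils new iphone")
G.add_node("topic_3", ntype="topic")         # e.g. one topic from a topic model
G.add_node("ent_apple_inc", ntype="entity")  # e.g. one linked entity

# A document connects to its most relevant topics and mentioned entities,
# so sparse short texts gain context through shared neighbours.
G.add_edge("doc_0", "topic_3")
G.add_edge("doc_0", "ent_apple_inc")

print(G.nodes(data=True))
```

In the semi-supervised setting, label information from the few annotated documents can propagate to unlabeled ones through these shared topic and entity nodes.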

Cited by 244 publications (118 citation statements)
References 23 publications
“…Tu et al. (2019) introduced a heterogeneous graph neural network to encode documents, entities and candidates together for multi-hop reading comprehension. Linmei et al. (2019) focused on semi-supervised short text classification and constructed a topic-entity heterogeneous neural graph.…”
Section: Related Work
confidence: 99%
“…Li et al. [25] proposed a two-level attention network to identify the sentiment of short texts: both local and long-distance dependency features are captured simultaneously by the attention mechanism, and the attention-based features are then used to capture more relevant features. Hu et al. [26] proposed a heterogeneous graph neural network for semi-supervised short text classification. In this method, a dual-level attention mechanism, comprising node-level and type-level attention, is used to learn the importance of neighbouring nodes and of different node types to the current node.…”
Section: B. Attention-Based Methods
confidence: 99%
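The dual-level attention described in the quote above can be sketched roughly as follows. This is a hedged reconstruction from the quoted description only, not the authors' implementation; the function, its arguments (att_type, att_node), and all shapes are assumptions:

```python
# Rough sketch of dual-level (type-level then node-level) attention for one
# node, reconstructed from the description above; all names are hypothetical.
import torch
import torch.nn.functional as F

def dual_level_attention(h_v, neighbors, types, att_type, att_node):
    """h_v: (d,) current node; neighbors: (n, d); types: n type ids;
    att_type: (d,) and att_node: (2d,) are learned attention vectors."""
    # Type-level attention: score each neighbour *type* using the sum of the
    # features of neighbours with that type, then softmax over types.
    type_ids = sorted(set(types))
    type_sums = torch.stack(
        [sum(neighbors[i] for i in range(len(types)) if types[i] == t)
         for t in type_ids])
    type_alpha = F.softmax(
        F.leaky_relu(type_sums @ att_type + h_v @ att_type), dim=0)

    # Node-level attention: score each neighbour, modulated by the attention
    # weight of its type, then softmax over neighbours.
    t_weight = torch.stack([type_alpha[type_ids.index(t)] for t in types])
    pair = torch.cat([h_v.expand_as(neighbors), neighbors], dim=1)
    node_alpha = F.softmax(F.leaky_relu(pair @ att_node) * t_weight, dim=0)
    return (node_alpha.unsqueeze(1) * neighbors).sum(dim=0)

# Toy usage: 3 neighbours of 2 types, feature dimension 4.
h = torch.randn(4)
out = dual_level_attention(h, torch.randn(3, 4), [0, 0, 1],
                           torch.randn(4), torch.randn(8))
```

Modulating node-level scores by the type-level weight is what lets the model downweight whole node types (e.g. all topic neighbours) when they are uninformative for the current node.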
“…GNNs have received increasing interest for their strong capability of encoding structural information (Kipf and Welling, 2016; Bastings et al., 2017). GAT is one representative model, which has demonstrated success in a number of NLP tasks (Huang and Carley, 2019; Linmei et al., 2019). In this work, we exploit GAT to represent tree-structural information for DRTS parsing.…”
Section: Related Work
confidence: 99%
“…We propose to improve DRTS parsing by making use of the above structure information, modeling the dependency-based syntax of the input sentences as well as the skeleton structure to enhance the baseline model of Liu et al. (2019) using Graph Attention Network (GAT) (Veličković et al., 2018), which has been demonstrated effective for tree/graph encoding (Huang and Carley, 2019; Linmei et al., 2019). In particular, we first derive dependency tree structures for each sentence in a paragraph from the Stanford Parser, and then encode them directly via one GAT module; the encodings are fed as inputs for decoding.…”
Section: Introduction
confidence: 99%
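For reference, a minimal single-head GAT layer in the spirit of Veličković et al. (2018), applied over an adjacency mask such as a dependency tree with self-loops; this is a generic sketch, not the DRTS parser's actual module, and the class and parameter names are assumptions:

```python
# Minimal single-head GAT layer (in the spirit of Veličković et al., 2018)
# over an adjacency mask such as a dependency tree; a generic sketch only.
import torch
import torch.nn as nn
import torch.nn.functional as F

class GATLayer(nn.Module):
    def __init__(self, d_in, d_out):
        super().__init__()
        self.W = nn.Linear(d_in, d_out, bias=False)   # shared projection
        self.a = nn.Linear(2 * d_out, 1, bias=False)  # attention vector

    def forward(self, h, adj):
        """h: (n, d_in) token states; adj: (n, n) 0/1 arcs incl. self-loops."""
        z = self.W(h)                                 # (n, d_out)
        n = z.size(0)
        # Attention logits e_ij = LeakyReLU(a^T [z_i || z_j]) for all pairs.
        pairs = torch.cat([z.unsqueeze(1).expand(n, n, -1),
                           z.unsqueeze(0).expand(n, n, -1)], dim=-1)
        e = F.leaky_relu(self.a(pairs).squeeze(-1))   # (n, n)
        # Mask non-neighbours so attention follows the tree structure.
        alpha = F.softmax(e.masked_fill(adj == 0, float("-inf")), dim=-1)
        return F.elu(alpha @ z)                       # (n, d_out)

# Toy usage: 4 tokens, one dependency arc (0 <-> 1) plus self-loops.
h = torch.randn(4, 16)
adj = torch.eye(4)
adj[0, 1] = adj[1, 0] = 1.0
out = GATLayer(16, 8)(h, adj)
```

The adjacency mask is what specializes a generic attention layer to tree encoding: each token attends only to its syntactic neighbours (and itself), as in the dependency-based setup the quote describes.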