Text Level Graph Neural Network for Text Classification
2019, Preprint
DOI: 10.48550/arxiv.1910.02356

Cited by 32 publications (36 citation statements)
References 17 publications
“…We discuss related works on the theory and applications of GNNs. There exist general GNN surveys [48], [49], [58], [100], [223], [241], [244], [251], works on theoretical aspects (the spatial-spectral dichotomy [11], [63], the expressive power of GNNs [178], or heterogeneous graphs [224], [229]), analyses of GNNs for specific applications (knowledge graph completion [5], traffic forecasting [119], [195], symbolic computing [137], recommender systems [220], text classification [115], or action recognition [3]), explainability of GNNs [232], and software (SW) and hardware (HW) accelerators and SW/HW co-design [1]. We complement these works as we focus on parallelism and distribution of GNN workloads.…”
Section: Complementary Analyses
confidence: 99%
“…Phan et al. [24] is the first work to apply Graph Neural Network classification to predicting the range of story points of the Deep-SE dataset. They rely on the graph construction and training process of the TextLevelGNN engine [14] to solve the estimation problem. Although TextLevelGNN is a well-known homogeneous GNN model in NLP research, Phan et al. [24] show that the original GNN models can negatively impact story point estimation due to the large vocabulary size and the number of edges in the constructed graph.…”
Section: RQ4: Can HeteroSP's Graphs Support Homogeneous GNN in Effort...
confidence: 99%
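The graph construction that citance refers to builds one small graph per document, with a node per distinct word and edges connecting words that co-occur within a sliding window. A minimal sketch of that idea (the function name, return format, and default window size are illustrative, not taken from the TextLevelGNN paper):

```python
def build_text_graph(tokens, window=2):
    """Build a per-document word graph: one node per distinct token,
    directed edges between tokens within `window` positions of each other."""
    nodes = sorted(set(tokens))                 # deduplicated vocabulary of this text
    idx = {w: i for i, w in enumerate(nodes)}   # token -> node index
    edges = set()
    for i, w in enumerate(tokens):
        lo, hi = max(0, i - window), min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if i != j:                          # no self-loops in this sketch
                edges.add((idx[w], idx[tokens[j]]))
    return nodes, sorted(edges)

nodes, edges = build_text_graph("the cat sat".split(), window=1)
```

The citance's point about edge count follows directly: a vocabulary of V words with window w yields up to O(V·w) edges per document, which grows quickly on long texts.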
“…In recent years, increasing attention and effort have been devoted to graph neural networks, which successfully extend deep neural networks to graph data. Graph neural networks have been demonstrated, theoretically and empirically, to be very powerful in graph representation learning [15,26,43], and have achieved great success in applications across domains such as natural language processing [6,22,53], computer vision [36,47], and recommendation [18,24,46,49,52,54,60]. There are mainly two groups of graph neural networks: spectral-based methods and spatial-based methods.…”
Section: Graph Neural Network
confidence: 99%
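Of the two groups that citance names, spatial-based methods are the easier to sketch: each layer aggregates feature vectors from a node's neighbors and applies a learned transform. A minimal NumPy illustration of one such layer, using mean aggregation (all names and the choice of aggregator are illustrative, not from any cited paper):

```python
import numpy as np

def spatial_gnn_layer(H, adj, W):
    """One spatial message-passing layer.
    H:   (N, d_in) node feature matrix
    adj: (N, N) binary adjacency matrix
    W:   (d_in, d_out) learned weight matrix
    Returns ReLU(mean_neighbor_features @ W)."""
    deg = adj.sum(axis=1, keepdims=True).clip(min=1)  # avoid divide-by-zero
    msgs = (adj @ H) / deg                            # mean over neighbors
    return np.maximum(msgs @ W, 0.0)                  # linear transform + ReLU

# Tiny example: a 3-node path graph with one-hot features.
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
H = np.eye(3)
out = spatial_gnn_layer(H, adj, np.eye(3))
```

Spectral-based methods instead define convolution via the eigendecomposition of the graph Laplacian; spatial layers like the one above are usually preferred at scale because they need only local neighborhood access.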