2019
DOI: 10.1609/aaai.v33i01.33017370

Graph Convolutional Networks for Text Classification

Abstract: Text classification is an important and classical problem in natural language processing. There have been a number of studies that applied convolutional neural networks (convolution on regular grid, e.g., sequence) to classification. However, only a limited number of studies have explored the more flexible graph convolutional neural networks (convolution on non-grid, e.g., arbitrary graph) for the task. In this work, we propose to use graph convolutional networks for text classification. We build a single text…
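The method the abstract describes (Text GCN) reduces to a standard two-layer graph convolution, Z = softmax(A_norm · ReLU(A_norm · X · W0) · W1), run over a single graph containing both document and word nodes. The following is a minimal NumPy sketch of that propagation rule, offered only as a reading aid: the toy adjacency matrix, hidden size, and class count are illustrative assumptions, not the authors' released code.

```python
import numpy as np

def normalize_adjacency(A):
    """Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def softmax(x):
    e = np.exp(x - x.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def two_layer_gcn(A, X, W0, W1):
    """Z = softmax(A_norm @ ReLU(A_norm @ X @ W0) @ W1)."""
    A_norm = normalize_adjacency(A)
    H = np.maximum(A_norm @ X @ W0, 0.0)   # first graph convolution + ReLU
    return softmax(A_norm @ H @ W1)        # second graph convolution + softmax

# Toy example: 5 nodes (think 3 words + 2 documents), identity features, 4 classes.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 1, 0],
              [1, 0, 1, 1, 1],
              [0, 1, 0, 0, 1],
              [1, 1, 0, 0, 0],
              [0, 1, 1, 0, 0]], dtype=float)
X = np.eye(5)                        # Text GCN uses one-hot (identity) node features
W0 = rng.normal(size=(5, 8)) * 0.1   # hidden width 8 (illustrative)
W1 = rng.normal(size=(8, 4)) * 0.1   # 4 output classes (illustrative)
print(two_layer_gcn(A, X, W0, W1).shape)   # (5, 4): per-node class distribution
```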

Cited by 1,607 publications (1,037 citation statements)
References 12 publications
“…Graph Convolutional Networks The second component of our solution is Graph Convolutional Networks (GCNs). Recently, GCNs have achieved immense success in capturing the underlying associations and correlations between entities and have been widely adapted to various domains such as computer vision [7] and natural language processing [12,13]. However, to the best of our knowledge, GCNs have never been explored in the audio domain.…”
Section: Methods
Mentioning confidence: 99%
“…Some other methods like [27] regard documents or sentences as graphs of words. In contrast, Yao et al. [26] propose a new way to construct the graph by regarding both documents and words as nodes, which performs quite well with GCN.…”
Section: Dual-Attention
Mentioning confidence: 99%
“…The whole architecture of the proposed dual-attention GCN framework is shown in Fig. 1, where the input is a graph based on a given text. For graph construction, we adopt the method proposed in [26], which first removes useless words from the texts and then models both the text and its words as nodes. This process is described in detail in Section 3.2.…”
Section: Overview of Dual-Attention GCN
Mentioning confidence: 99%
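As a rough illustration of the graph construction that these citing papers adopt from [26] (both documents and words become nodes of one graph), the sketch below builds a heterogeneous document-word adjacency matrix with TF-IDF weights on word-document edges. It is a simplification under stated assumptions: the word-word PMI edges of the original method are omitted, English stop-word removal stands in for "removing useless words", and the function and variable names are invented for this example.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer

def build_text_graph(docs):
    """Return a (n_docs + n_words) x (n_docs + n_words) adjacency matrix.

    Nodes 0..n_docs-1 are documents; the remaining nodes are vocabulary words.
    Word-document edges carry TF-IDF weights; word-word (PMI) edges are omitted.
    """
    vectorizer = TfidfVectorizer(stop_words="english")  # crude stand-in for dropping "useless" words
    tfidf = vectorizer.fit_transform(docs).toarray()    # shape: (n_docs, n_words)
    n_docs, n_words = tfidf.shape

    adj = np.zeros((n_docs + n_words, n_docs + n_words))
    adj[:n_docs, n_docs:] = tfidf     # document -> word edges
    adj[n_docs:, :n_docs] = tfidf.T   # word -> document edges (symmetric graph)

    node_names = [f"doc_{i}" for i in range(n_docs)] + list(vectorizer.get_feature_names_out())
    return adj, node_names

docs = [
    "graph convolutional networks for text classification",
    "convolutional neural networks classify text sequences",
    "attention mechanisms for document classification",
]
adj, node_names = build_text_graph(docs)
print(adj.shape, len(node_names))  # square adjacency over document + word nodes
```

The resulting adjacency matrix is the kind of input the two-layer GCN sketched after the abstract above would consume, with one row per document or word node.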
“…This trend has been increasing especially since the advent of Graph Neural Networks [4] and contextual Neural Networks for Graphs [5], which paved the way for modern graph-based Deep Learning [6] models. As of today, Graph Neural Networks are used successfully for predictive tasks such as semi-supervised classification [7], link prediction [8], and text classification [9].…”
Section: Introduction
Mentioning confidence: 99%