2021
DOI: 10.1109/access.2021.3096845

Graph Neural Networks Using Local Descriptions in Attributed Graphs: An Application to Symbol Recognition and Hand Written Character Recognition

Abstract: Graph-based methods have been widely used by the document image analysis and recognition community, as the different objects and the content in document images are best represented by this powerful structural representation. Designing novel computational tools for processing these graph-based structural representations has always remained a hot topic of research. Recently, Graph Neural Networks (GNNs) have been used for solving different problems in the domain of document image analysis and recognition. In this …
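The paper's central objects are attributed graphs, where both nodes and edges carry feature vectors. As an illustration only (not the authors' architecture, whose description is truncated above), the minimal NumPy sketch below shows one message-passing step in which each node aggregates its neighbours' attributes together with the attributes of the connecting edges; all function names, shapes, and weights here are assumptions.

```python
# Minimal sketch of message passing over an attributed graph (illustrative only).
import numpy as np

def message_passing_step(node_feats, edge_index, edge_feats, W_node, W_edge):
    """node_feats: (N, d_n) node attributes; edge_index: (E, 2) (src, dst) pairs;
    edge_feats: (E, d_e) edge attributes; W_node, W_edge: projection matrices."""
    N = node_feats.shape[0]
    messages = np.zeros((N, W_node.shape[1]))
    for (src, dst), e in zip(edge_index, edge_feats):
        # Message = projected source-node attributes + projected edge attributes.
        messages[dst] += node_feats[src] @ W_node + e @ W_edge
    # Update: combine each node's own projected features with aggregated messages.
    return np.tanh(node_feats @ W_node + messages)

# Toy attributed graph: 3 nodes with 2-D attributes, 2 directed edges with 1-D attributes.
x = np.array([[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]])
edges = np.array([[0, 1], [1, 2]])
e_attr = np.array([[1.0], [0.5]])
rng = np.random.default_rng(0)
h = message_passing_step(x, edges, e_attr, rng.normal(size=(2, 4)), rng.normal(size=(1, 4)))
print(h.shape)  # (3, 4)
```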

Cited by 14 publications (3 citation statements) · References 25 publications
“…[Table 3 columns: Paths, Stratified Paths, Cliques, Stratified Cliques] In Table 3, we finally show a brief comparison against current approaches for graph classification. Competitors span a variety of techniques, including classifiers working on top of GEDs [49,67], kernel methods [68][69][70][71] and several embedding techniques [68,72,73], including Granular Computing-based ones [32,74] and those based on neural networks and deep learning [75][76][77][78][79]. We can see that our method has comparable performance against current approaches in the graph classification literature.…”
Section: Computational Results
confidence: 80%
“…represents the number of master information nodes after aggregation, d represents the initial embedding dimension, and the generated dynamic adjacency matrix is defined in Eq. (13) as a matrix in ℝ^(3×3). A local graph convolution method is proposed in [33] to learn the information and distances of the graph, describing the graph locally through its node and edge information. In this paper, the graph convolution is defined in the vertex domain and omits edge information, so it is not necessary to compute the graph Laplacian, and each convolution is a global description of the graph.…”
Section: B. Information Aggregation Module
confidence: 99%
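The excerpt above contrasts a local description that uses both node and edge attributes with a purely vertex-domain convolution that drops edge information and never forms the graph Laplacian. A hedged NumPy sketch of the latter follows (standard spatial aggregation, not the exact formulation of either cited paper); the normalisation scheme and activation are assumptions.

```python
# Vertex-domain graph convolution: neighbour averaging over the adjacency matrix,
# using node features only, with no Laplacian or spectral decomposition involved.
import numpy as np

def vertex_domain_conv(A, X, W):
    """A: (N, N) adjacency matrix; X: (N, d) node features; W: (d, d_out) weights."""
    A_hat = A + np.eye(A.shape[0])               # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))     # row (degree) normalisation
    return np.maximum(D_inv @ A_hat @ X @ W, 0)  # ReLU(D^-1 (A + I) X W)

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
W = np.full((2, 2), 0.5)
print(vertex_domain_conv(A, X, W))  # (3, 2) convolved node features
```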
“…GAT and MoNet extend GCN by leveraging an explicit attention mechanism (Lee et al., 2019). Due to their powerful representation capabilities, GNNs have been applied to a wide range of applications including knowledge graphs (Zhang, Cui & Zhu, 2020), molecular graph generation (De Cao & Kipf, 2018), graph metric learning and image recognition (Kajla et al., 2021; Riba et al., 2021). Recently, graph sampling was investigated in GNNs for scaling to larger graphs and better generalization.…”
Section: Related Work
confidence: 99%
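The excerpt mentions GAT extending GCN with an explicit attention mechanism. The sketch below is an illustrative, single-head NumPy reimplementation of that idea (not taken from any of the cited papers): attention coefficients are computed per neighbour pair with a LeakyReLU-scored softmax and then used to weight the aggregation; the weight shapes and the 0.2 slope are assumptions.

```python
# Single-head, GAT-style attention over a small graph (illustrative sketch).
import numpy as np

def gat_attention(A, X, W, a):
    """A: (N, N) adjacency; X: (N, d) features; W: (d, d') projection;
    a: (2*d',) attention vector. Returns attention-weighted node updates."""
    H = X @ W                                        # project node features
    N = H.shape[0]
    logits = np.full((N, N), -np.inf)
    for i in range(N):
        for j in range(N):
            if A[i, j] or i == j:                    # neighbours plus self-loop
                s = a @ np.concatenate([H[i], H[j]])
                logits[i, j] = np.maximum(0.2 * s, s)  # LeakyReLU, slope 0.2
    alpha = np.exp(logits - logits.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)        # softmax over each neighbourhood
    return alpha @ H                                 # attention-weighted aggregation

A = np.array([[0, 1], [1, 0]], dtype=float)
X = np.array([[1.0, 2.0], [3.0, 4.0]])
rng = np.random.default_rng(1)
print(gat_attention(A, X, rng.normal(size=(2, 3)), rng.normal(size=6)))
```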