2023
DOI: 10.3934/era.2023115

Deep evidential learning in diffusion convolutional recurrent neural network

Abstract: Graph neural networks (GNNs) have been applied successfully to many graph tasks, but a limitation remains: many GNN models do not quantify the uncertainty of their output predictions. Uncertainty-quantification methods fall mainly into two types, frequentist and Bayesian, but both require sampling to gradually approximate the true distribution. In contrast, evidential deep learning formulates learning as an evidence-acquisition process, whic…
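As a minimal sketch of the evidential-regression idea the abstract contrasts with sampling-based methods (this is an illustrative reconstruction following the Normal-Inverse-Gamma head of Amini et al. 2020, cited below, not the paper's own implementation), a network emits four raw values per target that are mapped to valid evidential parameters, from which mean and uncertainty follow in closed form, with no sampling:

```python
import numpy as np

def softplus(x):
    # Numerically stable softplus: log(1 + exp(x)), used to enforce positivity.
    return np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0.0)

def nig_parameters(raw):
    """Map 4 raw network outputs to Normal-Inverse-Gamma (NIG) parameters.

    gamma (the predicted mean) is unconstrained; nu and beta must be > 0
    and alpha must be > 1, so they pass through shifted softplus maps.
    """
    gamma, nu, alpha, beta = raw
    return gamma, softplus(nu), softplus(alpha) + 1.0, softplus(beta)

def predictive_moments(gamma, nu, alpha, beta):
    # Closed-form predictive mean plus aleatoric and epistemic uncertainty
    # of the NIG evidential posterior: no Monte Carlo sampling needed.
    aleatoric = beta / (alpha - 1.0)
    epistemic = beta / (nu * (alpha - 1.0))
    return gamma, aleatoric, epistemic
```

The closed-form moments are the practical payoff: a single forward pass yields both a prediction and its uncertainty, which is the contrast with frequentist ensembles and Bayesian sampling the abstract draws.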

Cited by 4 publications
(3 citation statements)
References 11 publications
“…The use of graph neural networks (GNNs) is on the rise for analyzing graph structure data, as seen in recent research studies [7,12,14].…”
Section: Related Work, 2.1 Graph Neural Network
Citation type: mentioning
Confidence: 99%
“…Graph Neural Networks (GNNs) [29], a powerful technology for learning knowledge from graph-structured data, are gaining increasing attention in today's world, where graph-structured data such as social networks [12,27], molecular structures [6,25], traffic flows [19,21,41,47], and knowledge graphs [32] are widely used. GNNs work by propagating and fusing messages from neighboring nodes on the graph using message-passing mechanisms.…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
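The message-passing mechanism this citation describes, each node fusing messages propagated from its neighbours, can be sketched in a few lines of NumPy. The mean aggregation, shared linear transform, and ReLU here are illustrative assumptions for one common GNN layer shape, not the specific architecture of any cited model:

```python
import numpy as np

def message_passing_layer(A, H, W):
    """One round of mean-aggregation message passing.

    A: (n, n) adjacency matrix, H: (n, d) node features,
    W: (d, d_out) shared weight matrix.
    Each node averages its neighbours' features, then applies the
    shared linear transform followed by a ReLU nonlinearity.
    """
    deg = A.sum(axis=1, keepdims=True)
    agg = (A @ H) / np.maximum(deg, 1.0)   # mean over neighbours
    return np.maximum(agg @ W, 0.0)        # ReLU
```

Stacking such layers lets information propagate over multi-hop neighbourhoods, which is what makes the same template applicable to the social-network, molecular, traffic, and knowledge-graph data the passage lists.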
“…Previous ERN methods (Amini et al 2020;Malinin et al 2020;Charpentier et al 2021;Oh and Shin 2022;Feng et al 2023;Mei et al 2023) use specific activation functions like ReLU to ensure non-negative values for parameters of the evidential distribution, such as the variance. Nevertheless, the utilization of such activation functions may inadvertently hinder ERN models' capacity to learn effectively from training samples, thereby impairing overall model performance (Pandey and Yu 2023).…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
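The concern raised in this citation, that hard activations like ReLU can hinder learning of evidential parameters, can be made concrete with a small numerical check (illustrative only, not taken from any of the cited papers): on a negative pre-activation, ReLU outputs exactly zero with zero gradient, so no learning signal flows, while softplus stays strictly positive with a nonzero gradient everywhere.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def softplus(x):
    # Numerically stable log(1 + exp(x)); always strictly positive.
    return np.log1p(np.exp(-np.abs(x))) + np.maximum(x, 0.0)

def num_grad(f, x, eps=1e-6):
    # Central-difference estimate of df/dx.
    return (f(x + eps) - f(x - eps)) / (2.0 * eps)

# At a negative pre-activation, ReLU is flat (dead gradient), while
# softplus still outputs a positive value and passes gradient through.
x = -2.0
```

This is the trade-off the passage points to: ReLU guarantees non-negative evidential parameters such as the variance, but at the cost of regions where those parameters receive no gradient at all.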