2022
DOI: 10.1109/tnnls.2021.3090503

Multiresolution Reservoir Graph Neural Network

Cited by 23 publications (10 citation statements). References 25 publications.

“…The second category delves into the exploration of novel model designs and applications of RC, with an aim to enhance computational performance and efficiency in tasks related to pattern recognition. Possible solutions include (1) industrial applications such as adaptive practical nonlinear model predictive control [322] and digital twins [323]; (2) integrating RC with deep learning methods such as convolutional and graph neural networks [324,325,326,262,263]. Lastly, the third category is to keep investigating new architectures and mechanisms in physical hardware that are suitable for RC implementations.…”
Section: Physical RC and Extremely Efficient Hardware (mentioning)
confidence: 99%
“…Last but not least, some recent works since 2019 have merged RC with other systems or paradigms such as deep learning [49,50,218,239,92,117,324,325,326,262,263]. Moreover, some authors have started investigating modifications to the foundations themselves (e.g., NG-RC [208]).…”
Section: Hybrids and New Foundations (mentioning)
confidence: 99%
“…Dense connections build on the structure of snowball connections and extend it consistently to more convolutional layers, and the snowball structure can concatenate multi-scale features incrementally [9]. The multi-scale information of graphs can help enhance the expressiveness of GNNs on graph classification [26,27,28].…”
Section: Related Work (mentioning)
confidence: 99%
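The incremental concatenation of multi-scale features described in this snippet can be illustrated with a short sketch. The propagation rule, layer widths, and mean-pooling readout below are illustrative assumptions, not the exact models of [9] or [26,27,28]: each layer receives the concatenation of all earlier scales, and the graph representation stacks a pooled summary of every scale.

import numpy as np

def normalized_adjacency(A):
    # symmetrically normalized adjacency with self-loops: D^-1/2 (A + I) D^-1/2
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def snowball_multiscale(A, X, weights):
    # each layer sees the concatenation of all previously computed scales
    A_norm = normalized_adjacency(A)
    scales = [X]                                   # scale 0: raw node features
    for W in weights:
        H_in = np.concatenate(scales, axis=1)      # dense ("snowball") input
        scales.append(np.tanh(A_norm @ H_in @ W))  # one propagation step
    # graph-level representation: concatenate a pooled summary of every scale
    return np.concatenate([S.mean(axis=0) for S in scales])

# toy usage on a 4-node path graph with 3-dimensional node features
A = np.array([[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 3)
widths = [3, 8, 8]                                 # widths of the successive scales
weights = [np.random.randn(sum(widths[:i + 1]), widths[i + 1]) for i in range(2)]
print(snowball_multiscale(A, X, weights).shape)    # (3 + 8 + 8,) = (19,)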
“…• DiffPool [7], GIN [20], PGCN [26], and MRGNN [28]: these are graph neural networks with designs specialized for graph classification. GIN is a well-known GNN intended to match the discriminative power of the Weisfeiler-Lehman graph isomorphism test, and its graph-level readout function was specially designed to produce an embedding of the entire graph for graph classification tasks [20].…”
Section: Baseline (mentioning)
confidence: 99%
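As a rough illustration of the graph-level readout attributed to GIN in this snippet, the sketch below sum-pools the node embeddings produced at every layer and concatenates the pooled vectors into a single graph embedding. The two-layer MLP, fixed epsilon, and random untrained weights are simplifying assumptions, not the trained model of [20].

import numpy as np

def gin_layer(A, H, W1, W2, eps=0.0):
    # GIN update: MLP((1 + eps) * h_v + sum of neighbor features)
    agg = (1.0 + eps) * H + A @ H           # sum aggregation over neighbors
    return np.maximum(agg @ W1, 0.0) @ W2   # two-layer MLP with ReLU

def gin_graph_embedding(A, X, layer_weights):
    # concatenate sum-pooled node embeddings from the input and every layer
    H = X
    pooled = [H.sum(axis=0)]                # readout of layer 0 (raw features)
    for W1, W2 in layer_weights:
        H = gin_layer(A, H, W1, W2)
        pooled.append(H.sum(axis=0))        # readout of this layer
    return np.concatenate(pooled)           # graph-level representation

# toy usage: 4-node cycle, 3-dimensional node features, two GIN layers
A = np.array([[0, 1, 0, 1], [1, 0, 1, 0], [0, 1, 0, 1], [1, 0, 1, 0]], dtype=float)
X = np.random.randn(4, 3)
layer_weights = [(np.random.randn(3, 16), np.random.randn(16, 8)),
                 (np.random.randn(8, 16), np.random.randn(16, 8))]
print(gin_graph_embedding(A, X, layer_weights).shape)   # (3 + 8 + 8,) = (19,)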
“…Another path that has been recently pursued to define efficient architectures is reducing the excess complexity of common DGNs by removing the non-linearities [11,12]. In addition, Reservoir Computing (RC) approaches that provide a way to implement extremely efficient alternatives to end-to-end training of Neural Networks have been extended to graph domains [13,14,15]. We also remark that message passing is not limited to neural architectures: fully unsupervised, deep and probabilistic models exist as well [16,17,18].…”
Section: The Building Blocks (mentioning)
confidence: 99%
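A minimal sketch of the reservoir-computing idea mentioned for graph domains [13,14,15]: node states come from a fixed, untrained random message-passing map iterated over the graph, and only a linear readout on the pooled states is trained (ridge regression here). The spectral rescaling, iteration count, and mean-pooling choice are assumptions for illustration, not the specific models of the cited works.

import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_features, n_units, rho=0.9):
    # fixed random input/recurrent weights; the recurrent matrix is rescaled by
    # its spectral radius so repeated updates stay stable (echo-state heuristic)
    W_in = rng.uniform(-1, 1, (n_features, n_units))
    W_rec = rng.uniform(-1, 1, (n_units, n_units))
    W_rec *= rho / max(abs(np.linalg.eigvals(W_rec)))
    return W_in, W_rec

def graph_reservoir_embedding(A, X, W_in, W_rec, iters=10):
    # iterate the fixed (untrained) update over the graph, then mean-pool nodes
    H = np.zeros((A.shape[0], W_rec.shape[0]))
    for _ in range(iters):
        H = np.tanh(X @ W_in + A @ H @ W_rec)
    return H.mean(axis=0)

# toy usage: embed a few random graphs, then train only the linear readout
W_in, W_rec = make_reservoir(n_features=3, n_units=32)
graphs = []
for _ in range(20):
    upper = np.triu(rng.integers(0, 2, (5, 5)), k=1).astype(float)
    graphs.append((upper + upper.T, rng.standard_normal((5, 3))))
states = np.stack([graph_reservoir_embedding(A, X, W_in, W_rec) for A, X in graphs])
targets = rng.standard_normal(20)      # placeholder regression targets
ridge = 1e-2                           # only this linear readout is trained
readout = np.linalg.solve(states.T @ states + ridge * np.eye(32), states.T @ targets)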