2018 Eighth International Conference on Information Science and Technology (ICIST)
DOI: 10.1109/icist.2018.8426153
A Memristive Neural Networks Described by Differential-Algebraic Systems

Cited by 12 publications (26 citation statements)
References 18 publications
“…The learned policies were then tested online in a closed loop within the training distribution and in drastically distinct settings. This experimental protocol allows for the principled assessment of performance and generalization capabilities of liquid networks compared with modern deep models (38,41,42).…”
Section: Fly-to-target Task Training
confidence: 99%
“…5B) (50). These brain-inspired models are instances of continuous-time (CT) neural networks (35,41) that can be trained via gradient descent in modern automatic differentiation frameworks. Liquid networks exhibit stable and bounded behavior, yield superior expressivity within the family of CT neural models (35,41), and give rise to improved performance on a wide range of time series prediction tasks compared with advanced, recurrent neural network models (50).…”
Section: Liquid Network: Brain-inspired Neural Models
confidence: 99%
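The continuous-time formulation mentioned in this citation can be illustrated with a short sketch. Below is a minimal, generic CT-RNN cell unrolled with explicit Euler integration and trained by gradient descent through an automatic-differentiation framework (PyTorch here); the dynamics, time-constant parameterization, and toy data are illustrative assumptions, not the liquid-network equations of the cited works.

```python
import torch
import torch.nn as nn

class CTRNNCell(nn.Module):
    """Generic continuous-time RNN cell, dx/dt = (-x + tanh(W x + U u)) / tau,
    integrated with explicit Euler so the unrolled dynamics stay differentiable."""
    def __init__(self, input_size, hidden_size, dt=0.1):
        super().__init__()
        self.W = nn.Linear(hidden_size, hidden_size, bias=False)
        self.U = nn.Linear(input_size, hidden_size)
        self.log_tau = nn.Parameter(torch.zeros(hidden_size))  # per-unit time constant (assumed form)
        self.dt = dt

    def forward(self, x, u):
        tau = torch.exp(self.log_tau) + 0.1            # keep tau positive, bounded away from zero
        dxdt = (-x + torch.tanh(self.W(x) + self.U(u))) / tau
        return x + self.dt * dxdt                      # one explicit Euler step

# Unroll over a sequence and train with gradient descent via autograd.
cell = CTRNNCell(input_size=3, hidden_size=16)
opt = torch.optim.Adam(cell.parameters(), lr=1e-3)
u_seq = torch.randn(50, 3)         # toy input sequence (assumption)
target = torch.randn(16)           # toy target state (assumption)
x = torch.zeros(16)
for u in u_seq:
    x = cell(x, u)
opt.zero_grad()
loss = ((x - target) ** 2).mean()
loss.backward()                    # gradients flow through every Euler step
opt.step()
```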
“…These tools provide differentiable functions for physical simulations, which enable close integration with deep learning frameworks by leveraging their automatic differentiation functionality. Hybrid approaches that combine machine learning techniques with numerical PDE solvers (Wang et al, 2020;Illarramendi et al, 2022), have attracted a significant amount of interest due to their capabilities for generalization (Chen et al, 2018). In this context, neural networks are typically used to model or replace a part of the conventional PDE solver to improve aspects of the solving process.…”
Section: Neural Network With Differentiable PDE Solvers
confidence: 99%
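As a rough illustration of the hybrid pattern described above (a network modelling or replacing part of a conventional PDE solver), the sketch below writes one explicit finite-difference step of a 1-D heat equation in differentiable tensor operations and adds a small learned residual correction, so training gradients pass through the solver step. The equation, network architecture, and reference data are assumptions for illustration only.

```python
import torch
import torch.nn as nn

def heat_step(u, dt=1e-3, dx=1e-1):
    """One explicit finite-difference step of the 1-D heat equation,
    written with torch ops so gradients flow through the solver."""
    lap = (torch.roll(u, 1) - 2 * u + torch.roll(u, -1)) / dx**2  # periodic Laplacian
    return u + dt * lap

class Corrector(nn.Module):
    """Small network that learns a correction to the coarse solver step
    (the 'replace part of the solver' pattern; architecture is an assumption)."""
    def __init__(self, n):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n, 64), nn.Tanh(), nn.Linear(64, n))

    def forward(self, u):
        return heat_step(u) + self.net(u)   # coarse solver step + learned residual

n = 32
model = Corrector(n)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
u0 = torch.sin(torch.linspace(0, 6.283, n))    # toy initial condition (assumption)
u_ref = heat_step(heat_step(u0))               # stand-in for a fine/reference solution (assumption)
opt.zero_grad()
loss = ((model(u0) - u_ref) ** 2).mean()       # backprop passes through heat_step
loss.backward()
opt.step()
```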
“…For example, a framework useful for automatic learning PDEs from data [24,25] has been proposed. Another group used an adjoint method to learn differential equations parameterized with neural networks [26], while Ayed et al [27] proposed a framework for learning models using a purely data-driven approach in partially observable settings.…”
Section: Introduction
confidence: 99%
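A minimal sketch of the "learn differential equations parameterized with neural networks" idea referenced in [26]: the right-hand side dy/dt = f_theta(y) is a small network fitted to observed trajectories. The cited work computes gradients with the continuous adjoint method; this sketch substitutes plain backpropagation through an unrolled Euler integration (discretize-then-optimize), and the trajectory data shown are toy assumptions.

```python
import torch
import torch.nn as nn

# Parameterize the right-hand side dy/dt = f_theta(y) with a small network.
f = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 2))
opt = torch.optim.Adam(f.parameters(), lr=1e-3)

def integrate(y0, steps=20, dt=0.05):
    """Explicit Euler integration of the learned dynamics; fully differentiable."""
    y, traj = y0, [y0]
    for _ in range(steps):
        y = y + dt * f(y)          # one Euler step through the learned vector field
        traj.append(y)
    return torch.stack(traj)

y0 = torch.tensor([1.0, 0.0])
observed = integrate(y0).detach() + 0.01 * torch.randn(21, 2)  # toy "observations" (assumption)
for _ in range(100):
    opt.zero_grad()
    loss = ((integrate(y0) - observed) ** 2).mean()
    loss.backward()                # unrolled backprop stands in for the adjoint method
    opt.step()
```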