1995
DOI: 10.1103/physrevlett.75.3594

Neural Network Differential Equation and Plasma Equilibrium Solver

Abstract: A new generally applicable method to solve differential equations, based on neural networks, is proposed. Straightforward to implement, finite differences and coordinate transformations are not used. The neural network provides a flexible and compact base for representing the solution, found through the global minimization of an error functional. As a proof of principle, a two-dimensional ideal magnetohydrodynamic plasma equilibrium is solved. Since no particular topology is assumed, the technique is especially …
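The abstract describes representing the solution of a differential equation with a neural network and fitting it by globally minimizing an error functional. As a minimal, hedged sketch of that idea (not the authors' 1995 implementation, which used its own network and minimization scheme), the following solves the assumed test problem u'(x) = -u(x), u(0) = 1 on [0, 1] with a small PyTorch network; the ODE, network size, and optimizer are illustrative choices only:

```python
# Minimal sketch (assumption, not the paper's code): fit a neural-network
# representation of u(x) by minimizing a discretized error functional
# combining the ODE residual u' + u = 0 and the condition u(0) = 1.
import torch

torch.manual_seed(0)
net = torch.nn.Sequential(
    torch.nn.Linear(1, 20), torch.nn.Tanh(),
    torch.nn.Linear(20, 1),
)
x = torch.linspace(0.0, 1.0, 50).reshape(-1, 1).requires_grad_(True)
opt = torch.optim.Adam(net.parameters(), lr=1e-2)

for step in range(2000):
    u = net(x)
    # du/dx via automatic differentiation of the network output
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    residual = du + u                     # ODE residual u' + u = 0
    bc = net(torch.zeros(1, 1)) - 1.0     # boundary condition u(0) = 1
    loss = (residual ** 2).mean() + (bc ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# After training, net(x) approximates exp(-x); compare against torch.exp(-x).
```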

Cited by 118 publications (73 citation statements)
References 9 publications
“…Observe that the non-negativity constraints on the error variables are no longer required, because if any q_i (or p_j) is negative, then (15) can be reduced by letting q_i (or p_j) be equal to zero while still satisfying constraints (16). Furthermore, in the sequel we will show how the term b^2 in the objective function and the equality constraints, along the lines of Suykens et al. [21,22], allow us to obtain a simple, closed-form solution.…”
Section: RLSVRD: The Proposed SVR Approach to Learning with Derivatives
confidence: 99%
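The excerpt above alludes to a Suykens-style least-squares formulation in which equality constraints (and a b^2 term in the objective) yield a closed-form solution. As a hedged illustration of that structure only, here is a plain LS-SVM regressor solved from a single linear system; it omits the derivative terms of the cited RLSVRD method, and the RBF kernel, data, and hyperparameters are assumptions:

```python
# Sketch of the closed-form structure: Suykens-style least-squares SVM
# regression, where training reduces to solving one linear system.
import numpy as np

def rbf_kernel(X, Y, sigma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_fit(X, y, gamma=10.0, sigma=1.0):
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    # Equality constraints give the linear system
    # [[K + I/gamma, 1], [1^T, 0]] [alpha; b] = [y; 0]
    A = np.block([[K + np.eye(n) / gamma, np.ones((n, 1))],
                  [np.ones((1, n)),       np.zeros((1, 1))]])
    sol = np.linalg.solve(A, np.concatenate([y, [0.0]]))
    return sol[:n], sol[n]                 # dual coefficients alpha, bias b

def lssvm_predict(X_train, alpha, b, X_new, sigma=1.0):
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

# Usage sketch: fit a noisy sine and evaluate on the training grid.
X = np.linspace(0, 2 * np.pi, 40)[:, None]
y = np.sin(X[:, 0]) + 0.05 * np.random.default_rng(0).normal(size=40)
alpha, b = lssvm_fit(X, y)
y_hat = lssvm_predict(X, alpha, b, X)
```

Adding a b^2 term to the objective, as the cited paper does, removes the zero diagonal block and makes the system strictly positive definite; the version above is the standard formulation for reference.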
“…Estimation of a function along with its derivatives, or partial derivatives, is an important problem with diverse applications [9,15]. Derivatives of estimated static relations are often used for linearization in control and in extended Kalman filtering [3].…”
Section: Introduction
confidence: 99%
“…and (14). After solving the preceding system of equations, we can substitute these values into (9). Subsequently, using (8), we obtain a new symbolic expression.…”
Section: Incorporation of Boundary Conditions
confidence: 99%
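The excerpt refers to equations (8), (9), and (14) of the citing paper, which are not reproduced on this page. For orientation only, a common way boundary conditions are built into a neural-network trial solution (the construction popularized by Lagaris et al., cited further below) is

$$\psi_t(x) = A(x) + F(x)\,N(x,\mathbf{p}),$$

where $A(x)$ satisfies the boundary conditions exactly, $F(x)$ vanishes on the boundary, and $N(x,\mathbf{p})$ is the network output with parameters $\mathbf{p}$. For Dirichlet data $u(0)=u_0$, $u(1)=u_1$ on $[0,1]$, one may take $A(x) = u_0(1-x) + u_1 x$ and $F(x) = x(1-x)$. This is an illustrative standard form, not necessarily the cited paper's equations (8) and (9).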
“…Over the last decade, a number of meshfree schemes based on moving least-squares approximation [3], [4], radial basis functions [5]-[8], and feedforward neural networks [9]-[14] have been proposed in the literature. In these, the field variable is approximated as in (1), i.e., as an expansion in trial functions with undetermined coefficients.…”
confidence: 99%
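The generic expansion of the field variable in trial functions with undetermined coefficients, as described in the excerpt, can be illustrated with a small radial-basis-function collocation example. The Poisson-type test problem, Gaussian kernel, and shape parameter below are assumptions for the sketch, not taken from the cited works:

```python
# Hedged illustration of the meshfree ansatz u(x) ~ sum_j a_j * phi_j(x):
# Gaussian-RBF collocation for u''(x) = f(x) with u(0) = u(1) = 0,
# chosen so the exact solution is sin(pi x).
import numpy as np

eps = 5.0                                   # assumed RBF shape parameter
x = np.linspace(0.0, 1.0, 20)               # collocation points = RBF centres
f = -np.pi ** 2 * np.sin(np.pi * x)

def phi(x, c):      # Gaussian trial function
    return np.exp(-(eps * (x - c)) ** 2)

def phi_xx(x, c):   # its second derivative in x
    d = x - c
    return (4 * eps ** 4 * d ** 2 - 2 * eps ** 2) * np.exp(-(eps * d) ** 2)

X, C = np.meshgrid(x, x, indexing="ij")
A = phi_xx(X, C)                 # interior rows: sum_j a_j phi_j''(x_i) = f(x_i)
rhs = f.copy()
A[0, :], A[-1, :] = phi(x[0], x), phi(x[-1], x)   # boundary rows enforce u = 0
rhs[0] = rhs[-1] = 0.0

a = np.linalg.solve(A, rhs)      # the undetermined coefficients
u = phi(X, C) @ a                # reconstructed field at the collocation points
err = np.abs(u - np.sin(np.pi * x)).max()   # deviation from the exact solution
```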
“…In 1994, Dissanayake and Phan-Thien […]; solutions of two second-order elliptic problems to six decimal digits of precision were given by Van Milligen et al. [2]. In 1998, Lagaris et al. [3] introduced an artificial neural network for the solution of second-order nonlinear equations with mixed (Dirichlet and Neumann) boundary conditions, to seven decimal digits of precision.…”
Section: Introduction
confidence: 98%