Artificial Intelligence in Data Mining 2021
DOI: 10.1016/b978-0-12-820601-0.00011-2
Neural networks for data classification

Cited by 17 publications (10 citation statements)
References 12 publications
“…Numerically, minimisation is solved by means of the Levenberg–Marquardt algorithm [18], an iterative method that combines the robustness of the gradient descent algorithm (slow but convergent) with the speed of the Gauss–Newton solver.…”
Section: Validation Of the Modelmentioning
confidence: 99%
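The statement above describes Levenberg–Marquardt as blending gradient descent (robust but slow) with Gauss–Newton (fast near the optimum) via a damping parameter. A minimal, illustrative sketch of that damping scheme on a toy exponential fit; the function names and the test problem are assumptions for illustration, not taken from the cited works:

```python
import numpy as np

def levenberg_marquardt(residual, jacobian, x0, lam=1e-3, iters=50):
    """Damped Gauss-Newton: large lam behaves like gradient descent,
    small lam like Gauss-Newton."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        r = residual(x)
        J = jacobian(x)
        # Solve (J^T J + lam * I) dx = -J^T r for the step dx
        A = J.T @ J + lam * np.eye(x.size)
        dx = np.linalg.solve(A, -J.T @ r)
        x_new = x + dx
        if np.sum(residual(x_new) ** 2) < np.sum(r ** 2):
            x, lam = x_new, lam * 0.5   # good step: lean toward Gauss-Newton
        else:
            lam *= 2.0                  # bad step: lean toward gradient descent
    return x

# Toy problem: fit y = a * exp(b * t) to noiseless data, true (a, b) = (2, -1)
t = np.linspace(0.0, 2.0, 20)
y = 2.0 * np.exp(-1.0 * t)
res = lambda p: p[0] * np.exp(p[1] * t) - y
jac = lambda p: np.column_stack([np.exp(p[1] * t),
                                 p[0] * t * np.exp(p[1] * t)])
p = levenberg_marquardt(res, jac, x0=[1.0, 0.0])
```

The adaptive damping is the key design choice: the same update rule smoothly trades off convergence safety against speed without switching algorithms.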
“…The BPNN is trained by updating the weight and bias values according to the Levenberg–Marquardt optimization algorithm (Kumaraswamy, 2021) [32]. The model was constructed with an input layer containing the 14 independent variables (C1–C14) reported in Table 1, hidden layers, and an output layer with a single output corresponding to the C15 or C16 variable.…”
Section: Models and Their Performancementioning
confidence: 99%
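As a rough illustration of the architecture described above (14 inputs, hidden layers, one output), here is a sketch of a single forward pass; the hidden-layer size, activation function, and random initialization are arbitrary assumptions for illustration, not values reported in the cited work:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes matching the description: 14 inputs (C1-C14),
# one hidden layer (width chosen arbitrarily), one output (C15 or C16).
n_in, n_hidden, n_out = 14, 8, 1

W1 = rng.normal(scale=0.1, size=(n_hidden, n_in))
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_out, n_hidden))
b2 = np.zeros(n_out)

def forward(x):
    """Forward pass of a one-hidden-layer network."""
    h = np.tanh(W1 @ x + b1)   # hidden activation
    return W2 @ h + b2         # linear output

x = rng.normal(size=n_in)      # one sample with 14 features
y_hat = forward(x)
```

In Levenberg–Marquardt training, the weights and biases above are flattened into a single parameter vector and updated with the damped Gauss–Newton step, using the Jacobian of the per-sample errors with respect to those parameters.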
“…Voxel-wise processing of DW data requires a mathematical representation or biophysical model, an optimization algorithm, and an objective function that defines the goal of the optimizer. Before the advent of DL tools, gradient descent, Newton's method, and the Levenberg–Marquardt algorithm were popular for solving inverse problems (55). These algorithms have been used with objective functions that closely mimic the noise distribution of the data.…”
Section: Maximum Likelihood Framework For Aimentioning
confidence: 99%
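Under the common assumption of i.i.d. Gaussian noise, the maximum-likelihood objective reduces to least squares, which a Levenberg–Marquardt solver handles directly. A minimal sketch using SciPy's `least_squares` with `method="lm"`; the toy decay model and synthetic data are illustrative assumptions, not from the cited study:

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic mono-exponential decay with Gaussian noise,
# true parameters (amplitude, rate) = (1.5, 0.8)
rng = np.random.default_rng(1)
t = np.linspace(0.1, 3.0, 40)
true = np.array([1.5, 0.8])
data = true[0] * np.exp(-true[1] * t) + rng.normal(scale=0.01, size=t.size)

def residual(p):
    # Under i.i.d. Gaussian noise, minimizing the sum of squared
    # residuals is equivalent to maximizing the likelihood.
    return p[0] * np.exp(-p[1] * t) - data

fit = least_squares(residual, x0=[1.0, 1.0], method="lm")
```

With non-Gaussian noise (e.g. Rician noise in magnitude MR data), the objective function would be replaced by the corresponding negative log-likelihood rather than plain least squares.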