2022
DOI: 10.3390/math10244730
Perceptron: Learning, Generalization, Model Selection, Fault Tolerance, and Role in the Deep Learning Era

Abstract: The single-layer perceptron, introduced by Rosenblatt in 1958, is one of the earliest and simplest neural network models. However, it is incapable of classifying linearly inseparable patterns. A new era of neural network research started in 1986, when the backpropagation (BP) algorithm was rediscovered for training the multilayer perceptron (MLP) model. An MLP with a large number of hidden nodes can function as a universal approximator. To date, the MLP model is the most fundamental and important neural networ…

Cited by 27 publications (21 citation statements)
References 351 publications
“…Desire and Shengzhi reported that the hidden layer is a squeeze-and-excitation one. The hidden-layer transfer function is the hyperbolic tangent sigmoid, $a = \frac{e^{n} - e^{-n}}{e^{n} + e^{-n}}$, where n is the input and a is the output; the output-layer transfer function is the pure linear function a = n. Ke et al. derived a scaled conjugate gradient algorithm to train the architecture for prediction of the output, as shown in eq . There are six sets in total (one undamaged and five damaged).…”
Section: Results
confidence: 99%
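The pair of transfer functions quoted above (hyperbolic tangent sigmoid in the hidden layer, pure linear at the output) can be sketched in a few lines. The names follow MATLAB-style neural network conventions; the layer sizes and random weights below are purely illustrative assumptions, not the cited architecture:

```python
import numpy as np

def tansig(n):
    # Hyperbolic tangent sigmoid: a = (e^n - e^-n) / (e^n + e^-n)
    return np.tanh(n)

def purelin(n):
    # Pure linear transfer function: a = n
    return n

def forward(x, W1, b1, W2, b2):
    # One tansig hidden layer followed by a purelin output layer.
    hidden = tansig(W1 @ x + b1)
    return purelin(W2 @ hidden + b2)

rng = np.random.default_rng(0)
x = rng.normal(size=3)                                  # 3 inputs (assumed)
W1, b1 = rng.normal(size=(5, 3)), rng.normal(size=5)    # 5 hidden nodes (assumed)
W2, b2 = rng.normal(size=(1, 5)), rng.normal(size=1)    # single output
a = forward(x, W1, b1, W2, b2)                          # network output, shape (1,)
```

The tansig/purelin combination gives a bounded nonlinearity in the hidden layer while leaving the output unbounded, which is the usual choice for regression-type targets.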
“…With the generalized differential quadrature method (GDQ), the r-th order derivative of a function f(x) with n discrete grid points can be expressed as a weighted linear sum of the function values, for any number and distribution of grid points, as shown in eq . Shu’s method provides a great deal of versatility in how it can be applied to tackle different engineering challenges. $\left.\frac{\partial^{r} f(x)}{\partial x^{r}}\right|_{x=x_i} = \sum_{j=1}^{n} c_{ij}^{(r)} f_j$, where $x_i$ represents the different points in the variable region, and $f_j$ and $c_{ij}^{(r)}$ are the function values and the related weighting coefficients, respectively.…”
Section: Differential Quadrature Methods
confidence: 99%
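The GDQ formula above can be illustrated with Shu's explicit expression for the first-order weighting coefficients $c_{ij}^{(1)}$. This is a minimal sketch, not the cited implementation; the grid choice and test function are assumptions for demonstration:

```python
import numpy as np

def gdq_weights_first_order(x):
    """First-order GDQ weighting coefficients c_ij^(1) on an arbitrary grid x,
    using Shu's explicit formula: c_ij = M(x_i) / ((x_i - x_j) * M(x_j)) for
    i != j, with the diagonal fixed so each row sums to zero."""
    n = len(x)
    # M(x_i) = prod_{k != i} (x_i - x_k)
    M = np.array([np.prod([x[i] - x[k] for k in range(n) if k != i])
                  for i in range(n)])
    c = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                c[i, j] = M[i] / ((x[i] - x[j]) * M[j])
        c[i, i] = -np.sum(c[i, :])  # row sum of c^(1) must vanish
    return c

# Usage: approximate f'(x) for f(x) = x^3 on 5 non-uniform grid points.
x = np.cos(np.pi * np.arange(5) / 4)   # Chebyshev-type points on [-1, 1]
c1 = gdq_weights_first_order(x)
f = x**3
df = c1 @ f                            # df_i = sum_j c_ij^(1) f_j
```

With n = 5 points the quadrature differentiates polynomials of degree below n exactly, so `df` reproduces the analytic derivative 3x² at every grid point; higher-order coefficients $c_{ij}^{(r)}$ follow from a recurrence on $c_{ij}^{(1)}$.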
“…The multi-layer perceptron model is important in the sensors/biosensors field because it can learn complex, non-linear models (problems), handle large amounts of input data (if necessary), make quick predictions after training, achieve the same accuracy with smaller samples, and, additionally, be trained in real time. 33…”
Section: Methods
confidence: 99%
“…The multi-layer perceptron model is important in the sensors/biosensors field because it can learn complex, non-linear models (problems), handle large amounts of input data (if necessary), make quick predictions after training, achieve the same accuracy with smaller samples, and, additionally, be trained in real time. 33 The ANN procedure was as follows: for training, 14 concentrations from 0 to 800 mM for combinations of three pHs and three temperatures were used (126 total samples). Independently, 5 concentrations were prepared for the above combinations of pHs and temperatures as test samples (45 total samples).…”
Section: Br-mss Biosensor Experimental Configuration and Data Acquisition
confidence: 99%
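The 14-concentration × 3-pH × 3-temperature training design described above (126 samples) can be sketched with a minimal hand-rolled MLP regressor. Everything below is a synthetic stand-in: the sensor-response function, the specific pH and temperature values, and the network size are assumptions, since the excerpt does not provide the real biosensor data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Build the 126-sample training grid: 14 concentrations x 3 pHs x 3 temps.
conc = np.linspace(0, 800, 14)            # mM, as in the excerpt
ph = np.array([6.0, 7.0, 8.0])            # assumed pH values
temp = np.array([20.0, 25.0, 30.0])       # assumed temperatures (deg C)
X = np.array([[c, p, t] for c in conc for p in ph for t in temp])  # (126, 3)
y = np.tanh(X[:, 0] / 400) + 0.01 * X[:, 1] * X[:, 2]  # synthetic response

# Normalize inputs, then train one tanh hidden layer by batch gradient descent.
Xn = (X - X.mean(0)) / X.std(0)
W1 = rng.normal(scale=0.5, size=(8, 3)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=8);      b2 = 0.0

lr = 0.05
for _ in range(2000):
    H = np.tanh(Xn @ W1.T + b1)           # hidden activations, (126, 8)
    out = H @ W2 + b2                     # linear output layer
    err = out - y
    gW2 = H.T @ err / len(y); gb2 = err.mean()
    dH = np.outer(err, W2) * (1 - H**2)   # backprop through tanh
    gW1 = dH.T @ Xn / len(y); gb1 = dH.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

mse = np.mean((np.tanh(Xn @ W1.T + b1) @ W2 + b2 - y) ** 2)
```

The independent 45-sample test set from the excerpt would be evaluated the same way, using statistics from the training inputs for normalization.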