2017
DOI: 10.1371/journal.pone.0189369

Multilayer perceptron architecture optimization using parallel computing techniques

Abstract: The objective of this research was to develop a methodology for optimizing multilayer-perceptron-type neural networks by evaluating the effects of three neural architecture parameters, namely, number of hidden layers (HL), neurons per hidden layer (NHL), and activation function type (AF), on the sum of squares error (SSE). The data for the study were obtained from quality parameters (physicochemical and microbiological) of milk samples. Architectures or combinations were organized in groups (G1, G2, and G3) ge…
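The abstract describes exhaustively evaluating HL × NHL × AF combinations and ranking them by SSE, with parallel computing used to distribute the training runs. A minimal sketch of that kind of search, assuming illustrative parameter grids, synthetic placeholder data, and scikit-learn's MLPRegressor with Python multiprocessing; it is not the authors' exact implementation:

from itertools import product
from multiprocessing import Pool

import numpy as np
from sklearn.neural_network import MLPRegressor

HL = [1, 2, 3]                      # number of hidden layers (assumed grid)
NHL = [5, 10, 20]                   # neurons per hidden layer (assumed grid)
AF = ["logistic", "tanh", "relu"]   # activation function types (assumed set)

def evaluate(arch, X, y):
    # Train one candidate architecture and return its sum of squares error.
    hl, nhl, af = arch
    model = MLPRegressor(hidden_layer_sizes=(nhl,) * hl, activation=af,
                         max_iter=2000, random_state=0)
    model.fit(X, y)
    sse = float(np.sum((y - model.predict(X)) ** 2))
    return arch, sse

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 8))    # stand-in for the milk-quality predictors
    y = X @ rng.normal(size=8) + rng.normal(scale=0.1, size=200)

    # Train all candidate architectures in parallel, one process per core.
    with Pool() as pool:
        results = pool.starmap(evaluate, [(a, X, y) for a in product(HL, NHL, AF)])

    best_arch, best_sse = min(results, key=lambda r: r[1])
    print(f"best (HL, NHL, AF) = {best_arch}, SSE = {best_sse:.4f}")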

Cited by 63 publications (40 citation statements)
References 21 publications
“…ANNs have been widely used for prediction (multilayer perceptron, MLP) and classification (radial basis neural networks, RBN). An artificial neuron computes the weighted sum of its inputs and then applies an activation function to obtain a signal that is transmitted to the next neuron [33], [34].…”
Section: Methods
confidence: 99%
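A minimal sketch of the neuron described in this excerpt: a weighted sum of the inputs followed by an activation function. The logistic activation and the example values are illustrative assumptions:

import numpy as np

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs plus a bias, passed through a logistic activation.
    z = np.dot(weights, inputs) + bias
    return 1.0 / (1.0 + np.exp(-z))  # the signal transmitted to the next neuron

signal = neuron(inputs=np.array([0.5, -1.2, 3.0]),
                weights=np.array([0.8, 0.1, -0.4]),
                bias=0.2)
print(signal)  # a value in (0, 1), here about 0.33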
“…Based on the PI3K/AKT architecture represented by the MCMC transition matrix, the multilayer perceptron ANN model was constructed. The topology of this multilayer perceptron ANN was atypical, given that architectures for such models usually reduce the number of nodes per layer across the model (57). We hypothesize that the resulting architecture of the model can be associated with the biological topology of the SP and MCMC; nevertheless, additional analysis should be performed.…”
Section: Results
confidence: 99%
“…where Od is the output decision, ωj and ω0 represent the connection weights, and h is the hidden layer. The hidden layers are also connected to the output layers through neural connections that hold the output weights [33,71–75]. Initially, the connection weights hold random values until they intersect another connection, a phase in which they are multiplied by the associated weights at that intersection [34].…”
Section: Multilayer Perceptron Neural Network (MLP)
confidence: 99%
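The excerpt's "where" clause refers to an equation truncated from the quotation; from the named symbols it reads as a weighted sum of hidden-layer signals, Od = Σj ωj·hj + ω0. A minimal sketch under that assumption, with the layer size and the random initialization chosen for illustration:

import numpy as np

rng = np.random.default_rng(0)
h = rng.random(4)        # signals from a hidden layer with 4 neurons (assumed size)
w = rng.normal(size=4)   # output connection weights ωj, initially random
w0 = rng.normal()        # bias weight ω0, initially random

Od = np.dot(w, h) + w0   # output decision: weighted sum of the hidden signals
print(Od)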