2022 20th IEEE Interregional NEWCAS Conference (NEWCAS)
DOI: 10.1109/newcas52662.2022.9842242
A Multilayer Perceptron (MLP) Regressor Network for Monitoring the Depth of Anesthesia

Cited by 11 publications (7 citation statements)
References 27 publications
“…MLP Regressor (MLPR): The Multilayer Perceptron (MLP) is a class of feed-forward Artificial Neural Networks (ANN). An MLP architecture connects the input, hidden, and output layers in a feed-forward way. As a type of supervised learning, MLP uses backpropagation (a type of gradient-descent algorithm in which predetermined error-function values are calculated) to train the network.…”
Section: Methods (mentioning)
confidence: 99%
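
The MLPR described in the quoted statement maps directly onto off-the-shelf tooling. Below is a minimal sketch assuming scikit-learn's MLPRegressor; the layer sizes, solver, and the synthetic features standing in for physiological inputs are illustrative assumptions, not the paper's reported configuration.

```python
# Minimal sketch (assumption): a feed-forward MLP regressor of the kind the
# quoted passage describes, built with scikit-learn's MLPRegressor.
# Hyperparameters and the synthetic data are illustrative only.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                                   # 8 hypothetical input features per sample
y = X @ rng.normal(size=8) + rng.normal(scale=0.1, size=500)    # continuous regression target

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Feed-forward network: input -> hidden layers -> output, trained with
# backpropagation (gradient descent on a squared-error loss).
mlp = MLPRegressor(hidden_layer_sizes=(32, 16), activation="relu",
                   solver="adam", max_iter=2000, random_state=0)
mlp.fit(X_tr, y_tr)
print("R^2 on held-out data:", mlp.score(X_te, y_te))
```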
“…An MLP architecture connects the input, hidden, and output layers in a feedforward way. 82 As a type of supervised learning, MLP uses backpropagation 83 (a type of gradient-descent algorithm in which predetermined error-function values are calculated) to train the network. The values, which were reintroduced into the network following the computation, are used to adjust the weights of each layer's neurons.…”
Section: Machine Learning Algorithms (mentioning)
confidence: 99%
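
The quoted passage describes backpropagation informally: the error computed at the output is fed back through the network and used to adjust each layer's weights. A minimal NumPy sketch of one such gradient-descent step for a single-hidden-layer regressor is given below; shapes, learning rate, and initialisation are illustrative choices, not taken from the paper.

```python
# Minimal sketch (assumption): one backpropagation step for a one-hidden-layer
# MLP regressor, written with NumPy only to make the weight updates explicit.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(64, 8))          # batch of 64 samples, 8 features
y = rng.normal(size=(64, 1))          # continuous target
W1, b1 = rng.normal(scale=0.1, size=(8, 16)), np.zeros(16)
W2, b2 = rng.normal(scale=0.1, size=(16, 1)), np.zeros(1)
lr = 1e-2

# Forward pass: input -> hidden (ReLU) -> output (linear).
h = np.maximum(0.0, X @ W1 + b1)
y_hat = h @ W2 + b2

# Error function (mean squared error); its gradient is propagated backwards.
err = y_hat - y
grad_W2 = h.T @ err / len(X)
grad_b2 = err.mean(axis=0)
dh = (err @ W2.T) * (h > 0)           # chain rule through the ReLU
grad_W1 = X.T @ dh / len(X)
grad_b1 = dh.mean(axis=0)

# Gradient-descent update: the back-propagated error adjusts each layer's weights.
W2 -= lr * grad_W2; b2 -= lr * grad_b2
W1 -= lr * grad_W1; b1 -= lr * grad_b1
```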
“…MLP architecture connects the input, hidden, and output layers in a feed-forward way. 75 As a type of supervised learning, MLP uses backpropagation 76 (a type of gradient-descent algorithm in which predetermined error-function values are calculated) to train the network. The values, which were reintroduced into the network following the computation, are used to adjust the weights of each layer's neurons.…”
Section: Machine Learning (ML) Algorithms (mentioning)
confidence: 99%
“…The values, which were reintroduced into the network following the computation, are used to adjust the weights of each layer's neurons. 75 The metrics for all calculations were based on the coefficient of determination (R²) and the Root Mean Square Error (RMSE), the former is given by…”
Section: Machine Learning (ML) Algorithms (mentioning)
confidence: 99%
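
The quoted statement is truncated before the formula itself. For reference, the standard definitions of the two metrics (an assumption that the citing papers use the usual forms, with y_i the observed values, ŷ_i the predictions, and ȳ the mean of the observations) are:

```latex
% Standard definitions (assumption: the usual forms of the two metrics).
R^2 = 1 - \frac{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2}{\sum_{i=1}^{n} (y_i - \bar{y})^2},
\qquad
\mathrm{RMSE} = \sqrt{\frac{1}{n} \sum_{i=1}^{n} (y_i - \hat{y}_i)^2}.
```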