2022
DOI: 10.3390/math10152772

Quantitative Analysis of Anesthesia Recovery Time by Machine Learning Prediction Models

Abstract: It is significant for anesthesiologists to have a precise grasp of the recovery time of the patient after anesthesia. Accurate prediction of anesthesia recovery time can support anesthesiologist decision-making during surgery to help reduce the risk of surgery in patients. However, effective models have not been proposed to solve this problem for anesthesiologists. In this paper, we seek to find effective forecasting methods. First, we collected 1824 patients' anesthesia records from the eye center and then performed data …

Cited by 1 publication (2 citation statements)
References 44 publications
“…A drawback of decision trees is that they tend to overfit, which means that they will perform less well on new data. However, if a collection of many decision trees exists, a technique called random forest can make predictions more stable without overfitting [31][32][33][34][35]. Random forests utilize bootstrap aggregating (bagging) to combine multiple decision trees, resulting in higher accuracy by reducing bias from overfitting datasets with a single model (Figure 6) [21,25,[33][34][35][36].…”
Section: Ensemble Techniques: Bagging, Random Forest, and Boosting
confidence: 99%
“…However, if a collection of many decision trees exists, a technique called random forest can make predictions more stable without overfitting [31][32][33][34][35]. Random forests utilize bootstrap aggregating (bagging) to combine multiple decision trees, resulting in higher accuracy by reducing bias from overfitting datasets with a single model (Figure 6) [21,25,[33][34][35][36]. Ensemble techniques have become increasingly popular for ML applications because they improve accuracy and reduce variance [21,25].…”
Section: Ensemble Techniques: Bagging, Random Forest, and Boosting
confidence: 99%
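The citation statements above describe bootstrap aggregating (bagging) in general terms: fit many trees on bootstrap resamples and average their predictions to reduce the variance of a single overfit model. A minimal pure-Python sketch of that mechanism is shown below; it is illustrative only (not the models from the cited paper), and `fit_stump` is a hypothetical depth-1 learner standing in for a full decision tree.

```python
import random

def bootstrap_sample(xs, ys, rng):
    """Draw a bootstrap sample (sampling with replacement, same size as the data)."""
    n = len(xs)
    idx = [rng.randrange(n) for _ in range(n)]
    return [xs[i] for i in idx], [ys[i] for i in idx]

def fit_stump(xs, ys):
    """Fit a depth-1 'decision stump': split at the median x and
    predict the mean y on each side of the split."""
    pairs = sorted(zip(xs, ys))
    split = pairs[len(pairs) // 2][0]
    left = [y for x, y in pairs if x <= split] or ys   # guard against empty side
    right = [y for x, y in pairs if x > split] or ys
    lmean, rmean = sum(left) / len(left), sum(right) / len(right)
    return lambda x: lmean if x <= split else rmean

def bagged_predict(models, x):
    """Aggregate the ensemble by averaging the individual predictions."""
    return sum(m(x) for m in models) / len(models)

rng = random.Random(0)
xs = [i / 10 for i in range(40)]
ys = [x * x + rng.gauss(0, 0.2) for x in xs]  # noisy quadratic toy data

# Bagging: one stump per bootstrap resample, predictions averaged.
models = [fit_stump(*bootstrap_sample(xs, ys, rng)) for _ in range(50)]
print(round(bagged_predict(models, 3.5), 2))
```

A random forest adds one more ingredient on top of bagging: each tree also considers only a random subset of features at each split, which further decorrelates the ensemble members.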