2021
DOI: 10.3390/s21030930

Research on the Prediction of Green Plum Acidity Based on Improved XGBoost

Abstract: The acidity of green plum has an important influence on the fruit’s deep processing. Traditional physical and chemical analysis methods for green plum acidity detection are destructive, time-consuming, and unable to achieve online detection. In response, a rapid and non-destructive detection method based on hyperspectral imaging technology was studied in this paper. Research on prediction performance comparisons between supervised learning methods and unsupervised learning methods is currently popular. To furt…
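As a purely illustrative aside (not the authors' code), the sketch below shows how a prediction of this kind might be set up in Python with the xgboost and scikit-learn packages: an XGBoost regressor fit on mean reflectance spectra to predict acidity. All data shapes, band counts, and hyperparameters are assumptions for demonstration; the paper's preprocessing, wavelength selection, and "improved XGBoost" modifications are not reproduced here.

```python
# Hedged sketch (not the paper's implementation): regressing a green plum
# acidity value from mean hyperspectral reflectance with XGBoost.
# Sample count, band count, and hyperparameters are illustrative assumptions.
import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
n_samples, n_bands = 240, 256            # assumed numbers of fruit samples and spectral bands
X = rng.random((n_samples, n_bands))     # placeholder for mean ROI spectra from the HSI system
y = 2.5 + 1.5 * X[:, 40] + rng.normal(0, 0.05, n_samples)  # synthetic "acidity" target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = xgb.XGBRegressor(
    n_estimators=300,
    learning_rate=0.05,
    max_depth=4,
    subsample=0.8,
    objective="reg:squarederror",
)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("RMSE:", mean_squared_error(y_test, pred) ** 0.5)
print("R^2 :", r2_score(y_test, pred))
```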

Citations: cited by 21 publications (9 citation statements)
References: 19 publications
“…The diagram and other detailed parameters (e.g., the spectral resolution was 2.8 nm and the image resolution was 800 × 664 pixels) of this HSI system can be found in a previous work [23].…”
Section: Methods (mentioning)
confidence: 99%
“…The LR algorithm is often thought of as a traditional algorithm, but it is essentially a form of machine learning (34). XGBoost is an ML approach with the unique ability to integrate missing data quickly and flexibly, as well as to assemble weak prediction models into a more accurate one (35, 36). RF is an ML classifier that employs multiple trees to train on and predict samples.…”
Section: Discussion (mentioning)
confidence: 99%
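The statement above highlights two XGBoost properties: native handling of missing data and the assembly of weak learners into a stronger ensemble. A minimal, hypothetical Python sketch of the first property follows; the dataset and parameters are invented for illustration and are not from the cited works.

```python
# Hedged sketch: XGBoost's built-in handling of missing values.
# Data and hyperparameters are illustrative assumptions only.
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(1)
X = rng.random((200, 10))
X[rng.random(X.shape) < 0.1] = np.nan    # inject ~10% missing entries; no imputation is performed
y = (X[:, 0] > 0.5).astype(int)          # NaN comparisons evaluate to False, so y stays a valid 0/1 target

# Each boosting round fits a shallow ("weak") tree; rows with missing values are
# routed along a learned default direction at every split, and the ensemble of
# many such trees yields the stronger combined model.
clf = xgb.XGBClassifier(n_estimators=100, max_depth=3, learning_rate=0.1)
clf.fit(X, y)
print("training accuracy:", clf.score(X, y))
```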
“…At the same time, XGBoost also supports column sampling to avoid overfitting and reduce the computational workload. After each iteration, XGBoost applies the learning rate to the leaf nodes, shrinking the contribution of each tree and leaving room for subsequent learning (Liu et al., 2021b).…”
Section: Methods (mentioning)
confidence: 99%
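Below is a small, hypothetical sketch of the two mechanisms this excerpt describes, column subsampling and learning-rate shrinkage, as they are exposed through the xgboost Python API; the data and parameter values are illustrative assumptions rather than settings from the paper.

```python
# Hedged sketch: column subsampling (colsample_bytree) and shrinkage (learning_rate).
# All values below are illustrative assumptions.
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(2)
X = rng.random((300, 50))
y = X @ rng.normal(size=50) + rng.normal(0, 0.1, 300)

model = xgb.XGBRegressor(
    n_estimators=400,
    colsample_bytree=0.6,   # each tree sees a random 60% of the columns, curbing overfitting and cost
    learning_rate=0.05,     # leaf outputs are scaled down, leaving room for later trees to improve the fit
    max_depth=4,
    objective="reg:squarederror",
)
model.fit(X, y)
print("in-sample R^2:", model.score(X, y))
```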