2021
DOI: 10.1038/s41598-021-97131-8

Modeling hydrogen solubility in hydrocarbons using extreme gradient boosting and equations of state

Abstract: Due to industrial development, the design and optimal operation of processes in chemical and petroleum processing plants require accurate estimation of the hydrogen solubility in various hydrocarbons. Equations of state (EOSs) are limited in accurately predicting hydrogen solubility, especially at high-pressure and/or high-temperature conditions, which may lead to energy waste and a potential safety hazard in plants. In this paper, five robust machine learning models including extreme gradient boosting (XGBoost…
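As a rough illustration of the modeling approach the abstract describes, the sketch below trains an XGBoost regressor on tabular solubility data. The file name, feature columns, and hyper-parameter values are hypothetical placeholders, not the paper's actual dataset or settings.

```python
# Minimal sketch of an XGBoost regressor for hydrogen solubility,
# assuming a tabular dataset with temperature, pressure, and hydrocarbon
# descriptors as inputs (file and column names are hypothetical).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error
from xgboost import XGBRegressor

df = pd.read_csv("h2_solubility.csv")  # hypothetical dataset
X = df[["temperature_K", "pressure_MPa", "molecular_weight", "critical_temperature_K"]]
y = df["h2_mole_fraction"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = XGBRegressor(n_estimators=500, learning_rate=0.05, max_depth=6)
model.fit(X_train, y_train)

print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```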

Cited by 69 publications (28 citation statements)
References: 67 publications
“…In other words, it repeatedly chooses the training inputs in order to complement several classifiers and apply the proper weight for every classifier depending on its performance, with larger weights allocated to miscategorized data sets. The following are the common parts of the AdaBoost procedure 71 :…”
Section: Model Development (mentioning)
confidence: 99%
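The excerpt above summarizes how AdaBoost reweights samples and weak classifiers. As a minimal, generic sketch of that procedure (discrete AdaBoost with decision stumps, not the cited paper's exact implementation), the loop might look like this:

```python
# Minimal sketch of the AdaBoost reweighting loop described above
# (binary labels in {-1, +1}; decision stumps as weak learners).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    n = len(y)
    w = np.full(n, 1.0 / n)              # start with uniform sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # classifier weight based on its performance
        w *= np.exp(-alpha * y * pred)          # larger weights for misclassified samples
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(stumps, alphas, X):
    # weighted vote of the weak classifiers
    score = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(score)
```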
“…Moreover, the histogram approach does not necessitate extra archiving of pre-sorted findings and results can be saved in an 8-bit integer upon feature determination, which reduces storage use to 1/8 of what it was before. In spite of this, the model's efficiency suffers because of the strict dividing mechanism (Mohammadi et al, 2021).…”
Section: Machine Learning Algorithms (mentioning)
confidence: 99%
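The excerpt above refers to LightGBM's histogram-based training, in which each feature is bucketed into a small number of bins that fit in an 8-bit integer. A minimal sketch using the LightGBM Python API is shown below; the synthetic data and parameter values are illustrative assumptions, not those of the cited work.

```python
# Minimal sketch of LightGBM's histogram-based training: features are
# bucketed into at most `max_bin` discrete bins (255 fits in an 8-bit
# integer), so the binned data needs far less memory than pre-sorted values.
import numpy as np
import lightgbm as lgb

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 10))
y = X[:, 0] * 2.0 + rng.normal(scale=0.1, size=1000)

train_set = lgb.Dataset(X, label=y)
params = {
    "objective": "regression",
    "max_bin": 255,          # histogram bins per feature; stored as small integers
    "learning_rate": 0.05,
    "num_leaves": 31,
}
booster = lgb.train(params, train_set, num_boost_round=100)
print(booster.predict(X[:5]))
```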
“…The results demonstrate that the XGBoost travel time prediction model considerably enhances the performance and efficiency. XGBoost algorithm is also utilized to develop models for different domains such as estimating the hydrogen solubility in hydrocarbons [40] and predicting flooding susceptibility [41].…”
Section: Related Work: Comparative Analysis Of Travel Time Prediction... (mentioning)
confidence: 99%
“…Compared with XGBOOST, LightGBM is computationally less expensive and has better prediction accuracy [44]. We used the standard hyper-parameters of the two learning algorithms [40,41,45]: learning_rate, colsample_bytree, n_estimators, and max_depth for XGBoost, and learning_rate, bagging_frequency, n_estimators, and max_depth for LightGBM.…”
Section: Model Training and Tuning (mentioning)
confidence: 99%
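A minimal sketch of tuning the hyper-parameters named in this excerpt via cross-validated grid search is shown below. The search grids and synthetic data are illustrative assumptions; "bagging_frequency" corresponds to LightGBM's bagging_freq parameter (exposed as subsample_freq in the scikit-learn wrapper).

```python
# Illustrative grid search over the hyper-parameters named in the excerpt;
# grids and data are placeholders, not the values used in the cited work.
import numpy as np
from sklearn.model_selection import GridSearchCV
from xgboost import XGBRegressor
from lightgbm import LGBMRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))
y = X[:, 0] - 0.5 * X[:, 1] + rng.normal(scale=0.1, size=500)

xgb_grid = {
    "learning_rate": [0.01, 0.05, 0.1],
    "colsample_bytree": [0.7, 1.0],
    "n_estimators": [200, 500],
    "max_depth": [4, 6, 8],
}
lgbm_grid = {
    "learning_rate": [0.01, 0.05, 0.1],
    "subsample_freq": [1, 5],        # bagging frequency
    "n_estimators": [200, 500],
    "max_depth": [4, 6, 8],
}

xgb_search = GridSearchCV(XGBRegressor(), xgb_grid, cv=5,
                          scoring="neg_mean_absolute_error")
lgbm_search = GridSearchCV(LGBMRegressor(subsample=0.8), lgbm_grid, cv=5,
                           scoring="neg_mean_absolute_error")

xgb_search.fit(X, y)
lgbm_search.fit(X, y)
print(xgb_search.best_params_, lgbm_search.best_params_)
```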