2022
DOI: 10.1021/acs.iecr.2c00526
Toward Faster Operational Optimization of Cascaded MSMPR Crystallizers Using Multiobjective Support Vector Regression

Abstract: The mixed-suspension mixed-product removal (MSMPR) crystallization process is critical for optimal separation and purification operations in the pharmaceutical and fine chemical industries. Detailed mathematical model-based optimization is the current practice for this, but it is reported to be an extremely time-consuming exercise, prohibiting online implementation. To facilitate faster optimization, a novel data-driven modeling and optimization algorithm is proposed in this work. A limited number of high…
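To make the surrogate idea in the abstract concrete, here is a minimal sketch, assuming hypothetical decision variables and synthetic objective functions (none of which come from the paper): one support vector regression model is fit per objective on a small set of expensive simulator runs, and candidates are then screened cheaply on the surrogates instead of calling the simulator.

```python
# Minimal sketch of a multiobjective SVR surrogate: fit one SVR per objective
# on a few "expensive" runs, then evaluate candidates on the cheap surrogates.
# The decision variables, bounds, and objectives below are synthetic stand-ins,
# not the paper's MSMPR model.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical decision variables, e.g. residence time and temperature
X_train = rng.uniform([1.0, 280.0], [10.0, 320.0], size=(40, 2))

# Stand-ins for two expensive objectives (e.g. yield and mean crystal size)
f1 = np.sin(X_train[:, 0]) + 0.01 * X_train[:, 1]
f2 = np.cos(X_train[:, 0]) - 0.005 * X_train[:, 1]

# One SVR surrogate per objective; feature scaling matters for RBF kernels
surrogates = [
    make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0)).fit(X_train, f)
    for f in (f1, f2)
]

# Cheap surrogate evaluations replace simulator calls inside the optimizer
X_cand = rng.uniform([1.0, 280.0], [10.0, 320.0], size=(1000, 2))
preds = np.column_stack([s.predict(X_cand) for s in surrogates])
print("best candidate by objective 1:", X_cand[np.argmin(preds[:, 0])])
```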

Cited by 24 publications (9 citation statements)
References 59 publications
“…Notably, the quality of surrogate construction strongly depends on the distribution of training points, the sample size, and the optimal adjustment of the surrogate's hyperparameters. Hyperparameter optimization, combined with control over the number of expensive simulations so that the model does not overfit, has been reported in the literature. 59 Even though stochastic learning theory underlies the development of Kriging, the bounds on the hyperparameters are typically fixed arbitrarily. 57,60 …”
Section: Background of Study and Motivations
confidence: 99%
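A hedged illustration of the two points in this statement, using synthetic data and arbitrary settings of my own: the hyperparameters of a Kriging (Gaussian process) surrogate are fit within hand-picked length-scale bounds, and a cross-validated score keeps the tuning honest on held-out folds.

```python
# Sketch only: (i) surrogate hyperparameters tuned under cross-validation to
# guard against overfitting, and (ii) Kriging length-scale bounds fixed by
# hand, which is exactly the practice the cited works question.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 10.0, size=(50, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(50)

# These length-scale bounds are an arbitrary choice, made up for this sketch
kernel = RBF(length_scale=1.0, length_scale_bounds=(1e-2, 1e2))
gp = GaussianProcessRegressor(kernel=kernel, alpha=1e-2, normalize_y=True)

# Cross-validated score exposes overfitting that a training-set fit would hide
scores = cross_val_score(gp, X, y, cv=5, scoring="r2")
print("5-fold R^2: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```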
“…These include the following: (i) robust optimization of computationally expensive networks, employing a rigorous cross-validation methodology for hyperparameter tuning, 34 and (ii) multiobjective optimization of cascaded mixed-suspension mixed-product removal crystallizers. 35 In the current work, the data set is randomly split into a training set and a testing set using an 80-20 split. We use R-squared (R²) and mean absolute error (MAE) as evaluation metrics to assess the performance of our model on the test set (Table 1).…”
Section: Results
confidence: 99%
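The split-and-score protocol quoted above can be sketched as follows, with placeholder arrays standing in for the actual data set (which is not reproduced here):

```python
# Random 80-20 train/test split plus R-squared and MAE on the held-out set,
# as described in the quoted passage. X and y are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVR
from sklearn.metrics import r2_score, mean_absolute_error

rng = np.random.default_rng(2)
X = rng.uniform(size=(200, 4))  # placeholder features
y = X @ np.array([1.0, -2.0, 0.5, 3.0]) + 0.05 * rng.standard_normal(200)

# Random 80-20 split
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

model = SVR(kernel="rbf", C=10.0).fit(X_tr, y_tr)
y_hat = model.predict(X_te)

# Evaluation metrics on the test set
print("R^2: %.3f" % r2_score(y_te, y_hat))
print("MAE: %.3f" % mean_absolute_error(y_te, y_hat))
```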
“…The details of this method can be found in the article published by Chowdhury et al. [8] The automation of metaparameter estimation by the neural architecture search method can also be found in other articles. [10][11][12][13][14] …”
Section: Modelling Using Artificial Neural Network
confidence: 99%
“…By applying neural architecture search (NAS), the correct model structure can be found by balancing the prediction accuracy against overfitting of the model. [10][11][12][13][14] However, once the activation function is finalized, it is not changed during neural model building; only the best set of weights and biases is found to minimize the error.…”
Section: Introduction
confidence: 99%
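As a toy illustration of the NAS workflow this statement describes (real NAS methods are far richer than a grid of layer widths), the sketch below scores a few candidate hidden-layer layouts with a fixed tanh activation by cross-validation, then freezes the winning structure so that only weights and biases are optimized in the final fit. The data and candidate architectures are made up.

```python
# Toy architecture search: pick the hidden-layer layout with the best
# cross-validated score (balancing accuracy against overfitting), then
# refit it so only weights and biases are adjusted. Synthetic data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.uniform(-1.0, 1.0, size=(300, 3))
y = np.sin(2.0 * X[:, 0]) + X[:, 1] * X[:, 2]

candidates = [(8,), (16,), (32,), (16, 16)]  # architectures to search
best_arch, best_score = None, -np.inf
for arch in candidates:
    net = MLPRegressor(hidden_layer_sizes=arch, activation="tanh",
                       max_iter=2000, random_state=0)
    score = cross_val_score(net, X, y, cv=5, scoring="r2").mean()
    if score > best_score:
        best_arch, best_score = arch, score

# Structure is now frozen; the final fit adjusts only weights and biases
final = MLPRegressor(hidden_layer_sizes=best_arch, activation="tanh",
                     max_iter=2000, random_state=0).fit(X, y)
print("selected architecture:", best_arch, "CV R^2: %.3f" % best_score)
```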