2022
DOI: 10.1021/acs.iecr.2c02639

Modeling and Control of Nonlinear Processes Using Sparse Identification: Using Dropout to Handle Noisy Data

Abstract: Sparse identification of nonlinear dynamics (SINDy) is a recent nonlinear modeling technique that has demonstrated superior performance in modeling complex time-series data in the form of first-order ordinary differential equations (ODEs), which are explicit and continuous in time. However, a crucial step in the SINDy algorithm involves estimating the time derivative of the states from the discrete, measured data. Therefore, the presence of noise can greatly deteriorate the performance if it is not carefully c…
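The abstract describes the standard SINDy workflow: estimate time derivatives from discrete measurements, evaluate a library of candidate functions on the data, and solve a sparse regression. The sketch below illustrates that workflow with plain NumPy, using finite-difference derivatives and sequential thresholded least squares; it is not the paper's dropout-based variant, and the damped-oscillator system, noise level, and threshold value are illustrative assumptions.

```python
import numpy as np

# Simulate noisy measurements of a damped oscillator: x' = y, y' = -0.1*y - 2.0*x
# (an illustrative system, not one from the paper)
dt = 0.01
t = np.arange(0.0, 10.0, dt)
X = np.zeros((t.size, 2))
X[0] = [2.0, 0.0]
for k in range(1, t.size):
    x, y = X[k - 1]
    X[k] = [x + dt * y, y + dt * (-0.1 * y - 2.0 * x)]
X_noisy = X + 0.01 * np.random.randn(*X.shape)

# Step 1: estimate time derivatives from the discrete data (central differences);
# the abstract notes this is the step most sensitive to measurement noise.
dX = np.gradient(X_noisy, dt, axis=0)

# Step 2: evaluate a library of candidate functions on the data: 1, x, y, x^2, xy, y^2
x, y = X_noisy[:, 0], X_noisy[:, 1]
Theta = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])

# Step 3: sequential thresholded least squares promotes sparsity in the coefficients
Xi = np.linalg.lstsq(Theta, dX, rcond=None)[0]
for _ in range(10):
    Xi[np.abs(Xi) < 0.05] = 0.0                  # zero out small coefficients
    for j in range(dX.shape[1]):                 # refit each ODE on the surviving terms
        big = np.abs(Xi[:, j]) >= 0.05
        if big.any():
            Xi[big, j] = np.linalg.lstsq(Theta[:, big], dX[:, j], rcond=None)[0]

print(Xi)  # nonzero rows show which library terms enter each recovered ODE
```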

Cited by 11 publications (4 citation statements)
References: 75 publications (97 reference statements)
“…From this perspective, sparsity can be achieved through subset selection from the library using feature selection techniques in machine learning; this approach serves the purpose of identifying the relevant functions. To this end, in this study we employ SFS as a sparsity-promoting method that identifies the essential functions by iteratively selecting, from a nonlinear library, those that make the greatest contribution to a user-defined evaluation metric such as root-mean-squared error (RMSE) or the coefficient of determination (R²). It starts with an empty subset and iteratively adds one feature at a time until a user-defined stopping criterion is met.…”
Section: Methods
confidence: 99%
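As a rough illustration of the forward selection procedure this citing study describes, the sketch below runs scikit-learn's SequentialFeatureSelector over a small nonlinear library with an RMSE-based score. The library terms, the LinearRegression estimator, and the fixed number of selected features (standing in for a user-defined stopping criterion) are assumptions made for illustration only.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.feature_selection import SequentialFeatureSelector

rng = np.random.default_rng(0)
x = rng.uniform(-2.0, 2.0, size=500)
# Toy target: a sparse combination of two library terms plus a little noise
target = 1.5 * x - 0.5 * x ** 3 + 0.01 * rng.standard_normal(x.size)

# Nonlinear library of candidate features: x, x^2, x^3, sin(x), cos(x)
library = np.column_stack([x, x ** 2, x ** 3, np.sin(x), np.cos(x)])

# Forward SFS: start from an empty subset and add one feature at a time,
# keeping the feature that most improves the cross-validated RMSE score.
sfs = SequentialFeatureSelector(
    LinearRegression(),
    n_features_to_select=2,                  # stand-in for a stopping criterion
    direction="forward",
    scoring="neg_root_mean_squared_error",
    cv=5,
)
sfs.fit(library, target)
print(sfs.get_support())  # boolean mask of the selected library terms
```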
“…Traditionally, first-principles models are employed in many fields, such as biochemical science and manufacturing systems, to represent the nonlinear dynamics in the system. In real-world industrial processes, complex nonlinear behaviors and intricate input–output interactions cannot be captured by limited first-principles knowledge of the systems, leading to plant-model mismatch. The rapid increase in computational power and advancement in machine learning have paved the way for data-driven modeling approaches to emerge as a viable alternative to traditional first-principles models. The primary advantage of adopting data-driven approaches lies in the structural flexibility in modeling complex systems.…”
Section: Introduction
confidence: 99%
“…Data pre-processing techniques, such as standardization and normalization, are used to make variables that have different scales comparable. This helps machine learning algorithms make more accurate and consistent predictions (Fukami et al., 2021; Rubio-Herrero et al., 2022; Abdullah et al., 2022). Therefore, the microcystin, dissolved oxygen, and evaporation values were normalized due to their significant differences in scale.…”
Section: Data Preprocessing
confidence: 99%
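As a minimal illustration of the standardization and normalization step mentioned above, the sketch below rescales a toy table whose columns have very different magnitudes; the numbers are made up and only stand in for variables such as microcystin, dissolved oxygen, and evaporation.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler, MinMaxScaler

# Toy data: three variables on very different scales (values are illustrative)
data = np.array([[0.002, 8.1, 120.0],
                 [0.004, 6.5, 300.0],
                 [0.001, 9.3,  80.0]])

z_scored = StandardScaler().fit_transform(data)  # zero mean, unit variance per column
min_max = MinMaxScaler().fit_transform(data)     # rescaled to the [0, 1] range per column
print(z_scored, min_max, sep="\n")
```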
“…polynomials or trigonometric functions) applied to the data. This approach has led to many related algorithms and applications; see [21–34] and the references therein. The sparse optimization framework for learning governing equations was proposed in [35], along with an approach to discovering partial differential equations using a dictionary of derivatives.…”
Section: Introduction
confidence: 99%
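To complement the SINDy sketch given after the abstract, the example below shows the same dictionary idea with an L1-regularized (LASSO) fit in place of thresholded least squares; the polynomial/trigonometric dictionary and the regularization strength are illustrative assumptions, not the formulation of any specific cited work.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
t = np.linspace(0.0, 10.0, 400)
# Toy signal built from two dictionary terms plus a little noise
signal = 2.0 * np.sin(t) - 0.3 * t ** 2 + 0.05 * rng.standard_normal(t.size)

# Dictionary of candidate functions evaluated on the data
dictionary = np.column_stack([np.ones_like(t), t, t ** 2,
                              np.sin(t), np.cos(t), np.sin(2 * t)])

# The L1 penalty drives most coefficients to zero, selecting a few dictionary terms
model = Lasso(alpha=0.01, fit_intercept=False, max_iter=50_000).fit(dictionary, signal)
print(model.coef_)  # nonzero entries identify the active candidate functions
```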