2017
DOI: 10.1109/tmag.2017.2702168

An Easy-to-Implement Hysteresis Model Identification Method Based on Support Vector Regression


Year Published: 2018–2023

Cited by 10 publications (7 citation statements)
References 10 publications
“…The SVR is an expansion of the classical Support Vector Machine (SVM) from pure classification to regression tasks. Similar to the SVM, it is designed for estimation of high-dimensional, nonlinear problems when only a limited number of samples are available [26]. The SVR model implemented in this paper was based on [26].…”
Section: A Preliminary Evaluation of the LSTM
confidence: 99%
“…Similar to the SVM, it is designed for estimation of high-dimensional, nonlinear problems when only a limited number of samples are available [26]. The SVR model implemented in this paper was based on [26]. The hyperparameters of the SVR are chosen as follows: penalty C = 10, kernel = radial basis function (RBF), kernel coefficient γ = 0.1, margin of tolerance ε = 0.1.…”
Section: A Preliminary Evaluation of the LSTM
confidence: 99%
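For readers who want to reproduce the quoted setup, the sketch below shows how an SVR with the stated hyperparameters (C = 10, RBF kernel, γ = 0.1, ε = 0.1) could be fitted with scikit-learn. The training data here is synthetic and purely illustrative; it is not the data used in the citing paper.

```python
# Minimal sketch (not the authors' code): fit an SVR with the quoted
# hyperparameters using scikit-learn on synthetic 1-D data.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))                 # illustrative inputs
y = np.sin(3.0 * X).ravel() + 0.05 * rng.standard_normal(200)  # noisy target

# Hyperparameters as quoted: penalty C = 10, RBF kernel,
# kernel coefficient gamma = 0.1, margin of tolerance epsilon = 0.1.
model = SVR(kernel="rbf", C=10.0, gamma=0.1, epsilon=0.1)
model.fit(X, y)

y_pred = model.predict(X)
print("training RMSE:", np.sqrt(np.mean((y - y_pred) ** 2)))
```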
“…(10). Weight vector w, insensitive loss function ε, penalty factor C to minimize training errors, and ξ and ξ* are slack variables (Zhang et al. 2017).…”
Section: Methodology and Thermodynamics Background
confidence: 99%
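The symbols in that excerpt are those of the standard ε-insensitive SVR primal problem. As a point of reference only (this is the textbook formulation, not necessarily the exact equation numbered (10) in the citing paper), it reads:

```latex
\min_{w,\,b,\,\xi,\,\xi^{*}} \ \frac{1}{2}\lVert w\rVert^{2}
  + C \sum_{i=1}^{n}\bigl(\xi_{i} + \xi_{i}^{*}\bigr)
\quad \text{s.t.} \quad
\begin{cases}
  y_{i} - \langle w, \phi(x_{i})\rangle - b \le \varepsilon + \xi_{i},\\
  \langle w, \phi(x_{i})\rangle + b - y_{i} \le \varepsilon + \xi_{i}^{*},\\
  \xi_{i},\ \xi_{i}^{*} \ge 0 .
\end{cases}
```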
“…A quadratic programming method [32] is employed to solve Equation (3) to obtain the optimal Lagrange multipliers α_i and α_i^*. Then the optimal weight vector can be expressed as follows:…”
Section: SVR Model
confidence: 99%
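The quoted equation itself is truncated. For context only, the optimal weight vector that follows from the dual solution is conventionally written in the standard SVR form below (a generic reference, not necessarily the citing paper's exact expression):

```latex
w = \sum_{i=1}^{n} \bigl(\alpha_{i} - \alpha_{i}^{*}\bigr)\,\phi(x_{i}),
\qquad
f(x) = \sum_{i=1}^{n} \bigl(\alpha_{i} - \alpha_{i}^{*}\bigr)\, k(x_{i}, x) + b .
```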