2020
DOI: 10.1109/access.2020.2995615

Twin Least Squares Support Vector Regression of Heteroscedastic Gaussian Noise Model

Abstract: The training algorithm of twin least squares support vector regression (TLSSVR) transforms the inequality constraints in a pair of quadratic programming problems into equality constraints, so it achieves faster computational speed. Classical least squares support vector regression (LSSVR) assumes that the noise is Gaussian with zero mean and homoscedastic variance. However, the noise in some practical applications follows a Gaussian distribution with zero mean and heteroscedastic variance. In this…
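As a point of reference for the abstract's comparison, the classical LSSVR it mentions reduces to a single linear system precisely because its constraints are equalities. Below is a minimal Python sketch of that baseline; the RBF kernel, the hyperparameters C and gamma, and the toy data are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between row-sample matrices A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvr_fit(X, y, C=10.0, gamma=1.0):
    """Solve the LSSVR KKT system  [[0, 1^T], [1, K + I/C]] [b; a] = [0; y]."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / C
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b, dual weights alpha

def lssvr_predict(X_train, b, alpha, X_test, gamma=1.0):
    return rbf_kernel(X_test, X_train, gamma) @ alpha + b

# toy usage: noisy sine with homoscedastic Gaussian noise, as LSSVR assumes
rng = np.random.default_rng(0)
X = np.linspace(0, 6, 80)[:, None]
y = np.sin(X).ravel() + rng.normal(0, 0.1, 80)
b, alpha = lssvr_fit(X, y)
print(lssvr_predict(X, b, alpha, np.array([[1.5]])))
```

The uniform noise-variance assumption is visible in the single scalar C applied identically to every sample; the heteroscedastic model the paper proposes relaxes exactly that point.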

Cited by 8 publications (4 citation statements) · References 39 publications

Citation statements (ordered by relevance):
“…By introducing the idea of twin support vector machines into the regression problem [35], [36], Peng proposed ε-twin support vector regression (ε-TSVR) [37]; in the case of linear problems, ε-TSVR finds a pair of linear functions as in (12)…”
Section: B. Twin Support Vector Regression (TSVR) Theory
Mentioning confidence: 99%
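To make the "pair of linear functions" idea concrete: ε-TSVR fits a down-bound function f1(x) = w1·x + b1 and an up-bound function f2(x) = w2·x + b2, and the final regressor averages them. The sketch below is purely schematic, substituting a ridge least-squares fit for Peng's actual pair of QPs; the names eps1, eps2, and C are assumptions for illustration.

```python
import numpy as np

def fit_bound(X, target, C=10.0):
    """Ridge least-squares fit of one bound function w^T x + b (schematic)."""
    G = np.hstack([X, np.ones((len(X), 1))])   # augment with a bias column
    wb = np.linalg.solve(G.T @ G + np.eye(G.shape[1]) / C, G.T @ target)
    return wb[:-1], wb[-1]

def tsvr_style_fit(X, y, eps1=0.1, eps2=0.1, C=10.0):
    w1, b1 = fit_bound(X, y - eps1, C)   # down-bound function f1
    w2, b2 = fit_bound(X, y + eps2, C)   # up-bound function f2
    # the final regressor is the average of the two bound functions
    return lambda Z: 0.5 * ((Z @ w1 + b1) + (Z @ w2 + b2))
```

Twin formulations gain speed because each bound problem is roughly half the size of the single large problem a standard SVR solves; the least-squares twin variant (TLSSVR) goes further and replaces both QPs with linear systems.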
“…Likewise, Formula (4) can be applied to obtain the upper-bound prediction function of the TLSSVR model for any new test samples, as in Formula (7)…”
Section: Twin Least Squares Support Vector Regression
Mentioning confidence: 99%
“…Secondly, the SVM model abides by the principle of structural risk minimization rather than empirical risk minimization, thereby bounding the generalization error. Traditional SVM faces challenges, including high computational complexity. To address this issue, various approaches can be taken, such as speeding up training with techniques like chunking, sequential minimal optimization (SMO), SVMlight, SVMTORCH, and LIBSVM [6][7][8][9]. Alternatively, the SVM model can be modified, for example into the simplified support vector machine, the linear programming support vector machine, or the least squares support vector machine (LSSVM) [10][11][12].…”
Section: Introduction
Mentioning confidence: 99%
“…Later, Zhao et al. [21] introduced the least squares concept into TSVR and constructed the twin least squares support vector machine (TLSSVR). Zhang et al. [22], [23] constructed new models by combining a Gauss-Laplace mixed noise distribution and a heteroscedastic Gaussian noise distribution with TLSSVR, and the experimental results show that their prediction performance is better…”
Section: Introduction
Mentioning confidence: 99%
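The heteroscedastic idea that Zhang et al. attach to TLSSVR can be illustrated on the simpler (non-twin) LSSVR system from the first sketch: if each sample carries its own noise variance, the uniform I/C regularizer becomes a diagonal, sample-dependent one, so samples with larger assumed noise are fit less tightly. This is a generic weighted-LSSVR sketch under that assumption, not the paper's exact model; sigma2 is a hypothetical per-sample variance estimate.

```python
import numpy as np

def weighted_lssvr_fit(K, y, sigma2, C=10.0):
    """Heteroscedastic variant of the LSSVR KKT system: per-sample noise
    variances in sigma2 replace the uniform I/C term, so samples with
    larger assumed variance are penalized less for misfit. K is a
    precomputed kernel (Gram) matrix, e.g. from rbf_kernel above."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(sigma2) / C   # diagonal, sample-dependent ridge
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                # bias b, dual weights alpha
```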