2016
DOI: 10.1016/j.neucom.2016.01.105

Modified twin support vector regression

Year Published: 2018–2024


Cited by 28 publications (8 citation statements)
References 42 publications
“…Penalizing different samples according to their importance is more reasonable. For this reason, various methods [ 42 , 43 , 44 , 45 , 46 ] have been developed to address this shortcoming. For example, Xu et al. [ 44 ] proposed a K-nearest-neighbor weighted twin support vector regression that uses the local information of the samples to improve prediction accuracy.…”
Section: Introduction
confidence: 99%
“…For example, Xu et al. [ 44 ] proposed a K-nearest-neighbor weighted twin support vector regression that uses the local information of the samples to improve prediction accuracy. By clustering the training data according to their similarity, Parastalooi et al. [ 45 ] proposed an improved twin support vector regression. Ye [ 46 ] proposed an effective weighted Lagrangian ε-twin support vector regression (WL-ε-TSVR) with a quadratic loss function, in which a weight matrix D is introduced to reduce the influence of outliers on the regression to a certain extent, so that different penalties are imposed on different samples.…”
Section: Introduction
confidence: 99%
“…Zhao et al [26] proposed the notion of twin hyperplanes with the fast speed of least squares support vector regression (LSSVR) yields a new regressor, termed as twin least squares support vector regression (TLSSVR). N.Parastalooi et al [27] proposed modified twin support vector regression (MTSVR) for data regression. Peng [28] proposed a pair of quadratic programming problems (QPP) for directly optimizing TSVR in primal space (PTSVR) based on a series of linear equations.…”
Section: Introductionmentioning
confidence: 99%
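The TLSSVR idea described above — replacing the dual QPPs with least-squares-style linear systems for the two bound functions — can be sketched as follows. This is a deliberately simplified, assumed formulation for illustration (a ridge-like system for each bound, final output as the average of the two bounds), not the exact model of Zhao et al.:

```python
import numpy as np

def tlssvr_linear(X, y, C=1.0, eps=0.1):
    """Minimal sketch of a linear twin least-squares SVR.

    Each bound function is found by solving a regularized linear
    system (no quadratic programming). Assumed simplification of
    TLSSVR; the published method differs in its exact objective.
    """
    n = X.shape[0]
    A = np.hstack([X, np.ones((n, 1))])        # augmented matrix [X | 1]
    G = A.T @ A + np.eye(A.shape[1]) / C       # regularized normal matrix
    u1 = np.linalg.solve(G, A.T @ (y - eps))   # down-bound hyperplane
    u2 = np.linalg.solve(G, A.T @ (y + eps))   # up-bound hyperplane

    def predict(Xt):
        At = np.hstack([Xt, np.ones((Xt.shape[0], 1))])
        return 0.5 * (At @ u1 + At @ u2)       # average of the two bounds
    return predict
```

The key point the excerpt makes is the speed advantage: each bound costs one small linear solve instead of a QP.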
“…In 2010, Peng [15] proposed twin support vector regression (TSVR), which can be used to build prediction models for industrial data. Since then, several improved TSVR methods [16-22] have been proposed. By introducing a K-nearest-neighbor (KNN) weight matrix into the optimization problem of TSVR, the modified algorithms [16,19] improve the performance of TSVR.…”
Section: Introduction
confidence: 99%
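The KNN weighting mentioned above assigns each training sample a weight reflecting its local density, so that isolated points (likely outliers) are penalized less. A minimal sketch of one plausible weighting scheme — counting how often a sample is voted a neighbor; this is illustrative only and not the exact formulation of the cited KNN-weighted TSVR papers:

```python
import numpy as np

def knn_sample_weights(X, k=5):
    """Weight each sample by how many training points count it
    among their k nearest neighbors (normalized to (0, 1]).
    Isolated outliers receive few or no votes and hence low weight.
    Illustrative scheme, not Xu et al.'s exact weight matrix."""
    n = X.shape[0]
    # pairwise squared Euclidean distances
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)           # exclude self-distance
    knn = np.argsort(d2, axis=1)[:, :k]    # k nearest neighbors per row
    weights = np.zeros(n)
    for row in knn:
        weights[row] += 1.0                # votes received as a neighbor
    return weights / weights.max()
```

In the weighted TSVR variants, such a vector would populate a diagonal matrix that rescales the per-sample loss terms.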
“…To solve the ill-conditioned problem in the dual objective functions of traditional TSVR, an implicit Lagrangian formulation of TSVR [18] was proposed, ensuring that the matrices in the formulation are always positive semidefinite. Parastalooi et al. [21] added a new term to the objective function to capture the structural information of the input data. In comparison, a disadvantage of neural network techniques is that their optimization process may fall into a local optimum.…”
Section: Introduction
confidence: 99%
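The role of the weight matrix D discussed in these excerpts — imposing a different quadratic penalty on each sample so outliers contribute less — can be seen in the simplest possible setting. The sketch below uses plain weighted ridge regression (an assumption for illustration, not the WL-ε-TSVR objective itself) to show how a diagonal D enters the normal equations:

```python
import numpy as np

def weighted_ridge(X, y, weights, lam=1e-2):
    """Weighted ridge regression: a diagonal matrix D rescales each
    sample's quadratic loss, analogous to how WL-eps-TSVR uses D to
    down-weight outliers. Returns [slope coefficients..., intercept].
    Illustrative only; the twin-SVR objectives are more involved."""
    n = X.shape[0]
    A = np.hstack([X, np.ones((n, 1))])         # augmented [X | 1]
    D = np.diag(weights)                        # per-sample penalty matrix
    G = A.T @ D @ A + lam * np.eye(A.shape[1])  # weighted normal matrix
    return np.linalg.solve(G, A.T @ D @ y)
```

Setting a sample's weight to zero removes it from the fit entirely; intermediate weights shrink its influence, which is exactly the effect the excerpt attributes to D.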