Clusterwise support vector linear regression (2020)
DOI: 10.1016/j.ejor.2020.04.032

Cited by 17 publications (7 citation statements)
References 44 publications
“…However, there are some successful exceptions. For example, abs-linear forms of prediction tasks are solved using a successive piecewise linearization method in [12,13], a primal-dual prox method for problems in which both the loss function and the regularizer are nonsmooth is developed in [14], various nonsmooth optimization methods are applied to solve clustering, classification, and regression problems in [5,6,15–17], and finally, nonsmooth optimization approaches are combined with support vector machines in [18–20].…”
Section: Related Work (mentioning)
confidence: 99%
“…However, the value of the corresponding rule can be analyzed through a series of attributes of association rules: (1) Support, the support of the union of the antecedent (former) and the consequent (latter) items (Joki et al., 2020); (2) Confidence, the rule support divided by the support of the rule antecedent (Lin and James, 2020); (3) Lift, the ratio of the rule's confidence to the probability of containing the right-hand side (RHS) (Musab et al., 2019); it reflects the correlation between the left-hand side (LHS) and the RHS in association rules.…”
Section: Specific Presentation of Global Cyberspace Security Issues (mentioning)
confidence: 99%
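To make the three metrics quoted above concrete, here is a minimal Python sketch that computes support, confidence, and lift for a rule LHS → RHS over a toy transaction set. The transactions and the example rule are invented for illustration and are not taken from the cited papers.

```python
# Toy illustration of association-rule metrics: support, confidence, lift.
# The transaction data and the rule {bread} -> {milk} are hypothetical.

transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"milk", "butter"},
    {"bread", "milk", "butter"},
    {"milk"},
]

def support(itemset, transactions):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(lhs, rhs, transactions):
    """Rule support divided by the support of the antecedent (LHS)."""
    return support(lhs | rhs, transactions) / support(lhs, transactions)

def lift(lhs, rhs, transactions):
    """Confidence of the rule divided by the support of the consequent (RHS)."""
    return confidence(lhs, rhs, transactions) / support(rhs, transactions)

lhs, rhs = {"bread"}, {"milk"}
print(f"support    = {support(lhs | rhs, transactions):.2f}")    # 0.40
print(f"confidence = {confidence(lhs, rhs, transactions):.2f}")  # 0.67
print(f"lift       = {lift(lhs, rhs, transactions):.2f}")        # 0.83
```

A lift below 1, as in this toy rule, indicates that the LHS and RHS occur together less often than independence would predict.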
“…Clusterwise regression was first proposed by Späth [18], and it has gained new ideas and vitality in the era of big data. Joki et al. [19] introduced the support vector machine model from machine learning into CLR (clusterwise linear regression), transformed the problem into an unconstrained nonsmooth optimization problem, and designed a method based on an incremental algorithm and the double bundle method combined with DC optimization. Numerical experiments verify the reliability and effectiveness of the method.…”
Section: Introduction (mentioning)
confidence: 99%
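For readers unfamiliar with clusterwise linear regression (CLR), the following is a minimal sketch of the plain alternating least-squares idea behind CLR: assign each observation to the linear model that fits it best, then refit each model on its assigned points. It is not the SVM-based nonsmooth DC formulation or the incremental/double-bundle algorithm of Joki et al.; the synthetic data, the number of clusters, and the stopping rule are assumptions made for illustration.

```python
# Minimal clusterwise linear regression (CLR) sketch: alternate between
# assigning each point to its best-fitting linear model and refitting each
# model by ordinary least squares. Illustrative only; not the method of [19].
import numpy as np

def clusterwise_linear_regression(X, y, k=2, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    Xb = np.hstack([X, np.ones((n, 1))])       # add intercept column
    labels = rng.integers(0, k, size=n)        # random initial assignment
    coefs = np.zeros((k, Xb.shape[1]))
    for _ in range(n_iter):
        # Refit one linear model per cluster on its currently assigned points.
        for j in range(k):
            idx = labels == j
            if idx.sum() >= Xb.shape[1]:
                coefs[j], *_ = np.linalg.lstsq(Xb[idx], y[idx], rcond=None)
        # Reassign each point to the model with the smallest squared residual.
        residuals = (Xb @ coefs.T - y[:, None]) ** 2
        new_labels = residuals.argmin(axis=1)
        if np.array_equal(new_labels, labels):  # converged
            break
        labels = new_labels
    return coefs, labels

# Hypothetical data drawn from two linear regimes, y = 2x + 1 and y = -x + 5.
rng = np.random.default_rng(1)
x = rng.uniform(0, 10, size=200)[:, None]
group = rng.integers(0, 2, size=200)
y = np.where(group == 0, 2 * x[:, 0] + 1, -x[:, 0] + 5) + rng.normal(0, 0.3, 200)
coefs, labels = clusterwise_linear_regression(x, y, k=2)
print(np.round(coefs, 2))  # roughly [[2, 1], [-1, 5]] up to label order
```

The quoted passage describes how Joki et al. replace the least-squares subproblem above with a support vector regression loss and solve the resulting nonsmooth problem with an incremental algorithm and the double bundle method for DC optimization, rather than this simple alternating heuristic.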