2012
DOI: 10.1109/tnnls.2012.2205018
Hierarchical Approach for Multiscale Support Vector Regression

Abstract: Support vector regression (SVR) is based on a linear combination of displaced replicas of the same function, called a kernel. When the function to be approximated is nonstationary, the single-kernel approach may be ineffective, as it cannot follow the variations in frequency content across different regions of the input space. The hierarchical support vector regression (HSVR) model presented here aims to provide a good solution in these cases as well. HSVR consists of a set of hierarchical layers, eac…
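The layered scheme the abstract describes can be sketched as a residual-fitting cascade: each layer approximates what the previous, coarser layers left unexplained, using a kernel of smaller width. The sketch below is an illustration built on scikit-learn's `SVR`; the gamma schedule, the toy signal, and all parameter values are invented for the example, not taken from the paper.

```python
# Illustrative sketch of a hierarchical SVR cascade: each layer fits the
# residual left by the previous layers, with an RBF kernel of smaller width.
# Data and gamma schedule are invented for the example, not from the paper.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0.0, 1.0, 200)).reshape(-1, 1)
# Nonstationary target: low frequency everywhere, high frequency on the right.
y = np.sin(2 * np.pi * X[:, 0]) + 0.5 * np.sin(40 * np.pi * X[:, 0]) * (X[:, 0] > 0.5)

layers = []
prediction = np.zeros_like(y)
rmses = []
for gamma in [1.0, 10.0, 100.0, 1000.0]:  # coarse-to-fine kernel widths
    model = SVR(kernel="rbf", gamma=gamma, C=10.0, epsilon=0.01)
    model.fit(X, y - prediction)       # each layer fits the current residual
    prediction += model.predict(X)     # the approximation is the layer sum
    layers.append(model)
    rmses.append(float(np.sqrt(np.mean((y - prediction) ** 2))))
```

On this toy signal the coarse first layer captures the slow sine but not the high-frequency burst, so the training RMSE shrinks as finer-scale layers are added.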

Cited by 39 publications (26 citation statements)
References 21 publications
“…(1) For kernel function selection, many researchers integrate multiple-kernels learning [6,9] or construct some new kernel functions based on some prior knowledge available for the specific problems [16,17]. With the increasing number of kernels being available, it is not obvious which kernel function is a suitable one for a specific problem at hand.…”
Section: Introduction
confidence: 99%
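As a concrete illustration of the multiple-kernel idea this citing work mentions, one simple (non-learned) variant fixes a convex combination of RBF kernels at two widths and passes it to SVR as a callable kernel. The weights, widths, and data below are invented for the example; true multiple-kernel learning would optimize the combination from data.

```python
# Sketch of a fixed multi-kernel SVR: the kernel is a convex combination of a
# broad and a narrow RBF kernel. Weights and widths are illustrative only.
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVR

def combined_kernel(A, B):
    """0.5 * broad RBF + 0.5 * narrow RBF (a fixed, non-learned combination)."""
    return 0.5 * rbf_kernel(A, B, gamma=1.0) + 0.5 * rbf_kernel(A, B, gamma=200.0)

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, (150, 1))
y = np.sin(2 * np.pi * X[:, 0]) + 0.3 * np.sin(20 * np.pi * X[:, 0])

model = SVR(kernel=combined_kernel, C=10.0, epsilon=0.01).fit(X, y)
r2 = model.score(X, y)  # training R^2 of the two-scale fit
```

The narrow component lets the fit track the fast oscillation that a single broad kernel would smooth away.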
“…Many researchers have pointed out that three crucial problems existing in SVR urgently need to be addressed: (1) how to choose or construct an appropriate kernel for forecasting problems [8,9]; (2) how to optimize the parameters of SVR to improve the quality of prediction [10,11]; (3) how to construct a fast algorithm that operates in the presence of large datasets [12,13]. With unsuitable kernel functions or hyperparameter settings, SVR may produce poor prediction results.…”
Section: Introduction
confidence: 99%
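Problem (2) in the quote, tuning SVR's hyperparameters, is commonly handled by cross-validated grid search. The sketch below uses scikit-learn's `GridSearchCV` on synthetic data; the grid values and the toy signal are illustrative assumptions, not from the cited works.

```python
# Sketch of SVR hyperparameter selection via cross-validated grid search.
# Grid values and synthetic data are illustrative.
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVR

rng = np.random.default_rng(2)
X = rng.uniform(-1.0, 1.0, (120, 1))
y = np.sinc(3 * X[:, 0]) + 0.05 * rng.normal(size=120)

search = GridSearchCV(
    SVR(kernel="rbf"),
    param_grid={"C": [0.1, 1.0, 10.0], "gamma": [0.5, 5.0, 50.0], "epsilon": [0.01, 0.1]},
    cv=5,
)
search.fit(X, y)
best = search.best_params_  # (C, gamma, epsilon) combination with best CV R^2
```

Cross-validation scores each combination on held-out folds, so the selected hyperparameters are less prone to overfitting than those chosen on training error alone.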
“…This mapping is obtained through nonlinear parametric functions, named kernels. However, the performance of the classifier depends critically on the parameters involved; these are determined through nonlinear optimization that turns out to be time-consuming and does not always converge to the global minimum [29,33]. Moreover, SVMs do not have the flexibility to add new classes/images, as they need to be retrained from scratch in this case.…”
Section: Introduction
confidence: 99%
“…As SVM can be classified into SVM classification [23] and SVM regression (also called support vector regression, SVR) [24]–[26], we herein focus on SVR with an ε-insensitive loss function, i.e., ε-SVR. Initially, the spatial information expression and processing, as well as the fuzzy linguistic expression and rule inference, of a 3-D FLC are integrated into spatial fuzzy basis functions (SFBFs), and then the 3-D FLC can be described by a three-layer network structure.…”
Section: Introduction
confidence: 99%
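The ε-insensitive loss this citing work refers to can be written as max(0, |y − f(x)| − ε): residuals inside the ε-tube incur no penalty, which is what gives ε-SVR its sparse set of support vectors. A minimal numeric illustration, with invented values:

```python
# The epsilon-insensitive loss of epsilon-SVR: zero inside the tube,
# linear in the residual magnitude beyond it. Values are invented.
import numpy as np

def eps_insensitive_loss(y_true, y_pred, eps=0.1):
    """max(0, |y - f(x)| - eps): residuals within eps cost nothing."""
    return np.maximum(0.0, np.abs(y_true - y_pred) - eps)

loss = eps_insensitive_loss(np.array([1.0, 1.0, 1.0]), np.array([1.05, 1.2, 0.7]))
# residual 0.05 is inside the tube -> 0; residual 0.2 -> 0.1; residual 0.3 -> 0.2
```

Only points whose residual reaches the tube boundary contribute to the solution, so ε directly controls the trade-off between sparsity and fit accuracy.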