Proceedings of the 16th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 2010
DOI: 10.1145/1835804.1835886

Boosting with structure information in the functional space

Abstract: Boosting is a very successful classification algorithm that produces a linear combination of "weak" classifiers (a.k.a. base learners) to obtain high-quality classification models. In this paper we propose a new boosting algorithm where base learners have structure relationships in the functional space. Though such relationships are generic, our work is particularly motivated by the emerging topic of pattern-based classification for semi-structured data including graphs. Towards an efficient incorporation of t…
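The setup the abstract describes, a weighted linear combination F(x) = sum_t alpha_t h_t(x) of weak classifiers h_t, can be sketched with a generic AdaBoost-style loop. The sketch below is illustrative only and is not the paper's structure-aware algorithm; the `weak_learner` interface is a hypothetical stand-in.

```python
import numpy as np

def adaboost(X, y, weak_learner, n_rounds=50):
    """Minimal AdaBoost sketch: builds F(x) = sum_t alpha_t * h_t(x).

    Labels y must be in {-1, +1}. `weak_learner(X, y, w)` is an
    assumed interface returning a callable h with h(X) in {-1, +1}.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)              # uniform example weights
    ensemble = []                        # list of (alpha, h) pairs
    for _ in range(n_rounds):
        h = weak_learner(X, y, w)
        pred = h(X)
        err = w[pred != y].sum()         # weighted training error
        if err >= 0.5:                   # no better than chance: stop
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        ensemble.append((alpha, h))
        w *= np.exp(-alpha * y * pred)   # up-weight misclassified points
        w /= w.sum()
    return lambda X_: np.sign(sum(a * h(X_) for a, h in ensemble))
```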

Cited by 35 publications (49 citation statements)
References 44 publications
“…Filter approaches are independent of any specific learning algorithm [22,2], while wrapper approaches involve a learning algorithm as part of the evaluation procedure [39,23]. Some representative wrapper methods include Sequential Forward Selection (SFS) [39], Sequential Forward Floating Selection (SFFS) [32], and sparse logistic regression based methods [27,29,14,34].…”
Section: Metaheuristics for Feature Selection
confidence: 99%
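The wrapper idea quoted above can be sketched as a sequential forward selection loop that repeatedly retrains a learner to score candidate features; scikit-learn's LogisticRegression serves here as an arbitrary stand-in for the wrapped learner. SFFS additionally allows conditional backward (removal) steps, which this minimal version omits.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def sequential_forward_selection(X, y, k):
    """Wrapper-style SFS sketch: greedily add the feature whose
    inclusion most improves the learner's cross-validated accuracy."""
    selected, remaining = [], list(range(X.shape[1]))
    while remaining and len(selected) < k:
        def cv_score(j):
            cols = selected + [j]
            clf = LogisticRegression(max_iter=1000)  # the wrapped learner
            return cross_val_score(clf, X[:, cols], y, cv=5).mean()
        best = max(remaining, key=cv_score)
        selected.append(best)
        remaining.remove(best)
    return selected
```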
“…For subgraph feature based methods, the major goal is to identify significant subgraphs that can serve as signatures for different classes [7], [29], [5], [9], [30]. By using subgraph features selected from the graph set, one can map graphs into a vector space, so that existing machine learning methods can be applied for classification [7], [29].…”
Section: Related Work
confidence: 99%
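A minimal sketch of the vectorization step described above, assuming a set of already-selected subgraph features and using networkx's GraphMatcher. Note that GraphMatcher tests node-induced subgraph isomorphism, a simplification of the general containment test usually used in the pattern-mining literature.

```python
import networkx as nx
from networkx.algorithms.isomorphism import GraphMatcher

def subgraph_feature_vector(graph, subgraph_features):
    """Map a graph to a binary vector whose i-th entry is 1 iff the
    i-th selected subgraph occurs in `graph`."""
    return [1 if GraphMatcher(graph, g).subgraph_is_isomorphic() else 0
            for g in subgraph_features]
```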
“…After obtaining subgraph features, one can also employ boosting algorithms for graph classification [34], [5], [9], [35]. In [9], the authors proposed boosting subgraph decision stumps over frequent subgraphs: a minimum-support threshold must first be provided to mine a set of frequent subgraphs, and the function-space knowledge is then utilized for boosting.…”
Section: Related Work
confidence: 99%
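The subgraph decision stumps mentioned above can be sketched as follows, assuming the frequent-subgraph mining step has already been run. This is a generic weighted-error stump search, not the functional-space-aware selection proposed in the cited work; passing `best_stump` (partially applied with the mined subgraphs) as the weak learner of the AdaBoost loop sketched earlier yields a basic graph-boosting pipeline.

```python
from networkx.algorithms.isomorphism import GraphMatcher

def subgraph_stump(g, polarity):
    """Decision stump over a subgraph feature g: predict `polarity`
    (+1 or -1) when g occurs in the input graph, else -polarity."""
    def h(graph):
        present = GraphMatcher(graph, g).subgraph_is_isomorphic()
        return polarity if present else -polarity
    return h

def best_stump(graphs, y, w, frequent_subgraphs):
    """Pick the stump with the lowest weighted error over pre-mined
    frequent subgraphs (the mining step itself is assumed done)."""
    candidates = [subgraph_stump(g, p)
                  for g in frequent_subgraphs for p in (+1, -1)]
    def weighted_error(h):
        return sum(wi for wi, Gi, yi in zip(w, graphs, y) if h(Gi) != yi)
    return min(candidates, key=weighted_error)
```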
“…This is an extension of the ℓ2-norm regularized Laplacian on a single vector in [12]. With this addition, we obtain the following optimization problem:…”
Section: Graph Guided Joint Sparse PCA
confidence: 99%
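As a hedged reconstruction of what an ℓ2-norm Laplacian regularizer and its joint (multi-vector) extension typically look like; the notation below is ours, not the cited paper's.

```latex
% For a weight vector w on a graph with adjacency A, degree matrix D,
% and Laplacian L = D - A, the regularizer is the usual smoothness
% penalty that ties together weights of adjacent nodes:
\[
  w^\top L\, w \;=\; \tfrac{1}{2}\sum_{i,j} A_{ij}\,(w_i - w_j)^2 .
\]
% The joint extension applies the same penalty to every column w_k of a
% loading matrix W and sums, i.e. a trace penalty:
\[
  \operatorname{tr}\!\left(W^\top L\, W\right) \;=\; \sum_{k} w_k^\top L\, w_k .
\]
```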