Proceedings of the 22nd International Conference on Machine Learning - ICML '05 2005
DOI: 10.1145/1102351.1102445
Learning hierarchical multi-category text classification models

Abstract: We present a kernel-based algorithm for hierarchical text classification where the documents are allowed to belong to more than one category at a time. The classification model is a variant of the Maximum Margin Markov Network framework, where the classification hierarchy is represented as a Markov tree equipped with an exponential family defined on the edges. We present an efficient optimization algorithm based on incremental conditional gradient ascent in single-example subspaces spanned by the marginal dual…
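The optimization step the abstract describes is a conditional-gradient (Frank-Wolfe) method. As a minimal illustrative sketch — not the paper's actual marginal-dual implementation — the iteration linearizes the objective at the current point, calls an oracle for the best feasible point of that linearization, and steps toward it. The toy objective and simplex constraint below are assumptions chosen only to make the example runnable:

```python
import numpy as np

def conditional_gradient_ascent(grad, linear_max, x0, steps=2000):
    """Generic conditional-gradient (Frank-Wolfe) ascent sketch.

    grad       -- gradient of a concave objective at x
    linear_max -- oracle returning argmax over the feasible set of <grad, s>
    x0         -- feasible starting point
    """
    x = x0
    for t in range(steps):
        g = grad(x)
        s = linear_max(g)        # best feasible point for the linearized objective
        gamma = 2.0 / (t + 2)    # standard diminishing step size
        x = x + gamma * (s - x)  # convex combination stays feasible
    return x

# Toy example: maximize f(x) = -||x - c||^2 over the probability simplex.
c = np.array([0.7, 0.2, 0.1])
grad = lambda x: -2.0 * (x - c)
# Over the simplex, the linear maximizer is the vertex of the largest gradient entry.
linear_max = lambda g: np.eye(len(g))[np.argmax(g)]
x_star = conditional_gradient_ascent(grad, linear_max, np.ones(3) / 3)
```

Because each iterate is a convex combination of simplex vertices, feasibility never needs to be re-enforced — the property that makes this family of methods attractive for the structured marginal-dual polytopes mentioned in the abstract.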

Cited by 163 publications (219 citation statements) | References 7 publications
“…Hierarchical classification can, however, be seen as a whole rather than a series of local learning tasks; the idea being to optimize the global performance all at once. This approach is adopted in [4][5][6][7][8].…”
Section: Related Work
confidence: 99%
“…In [6], Rousu et al presented a kernel-based method in which the classification model is a variant of the maximum margin Markov network framework. This algorithm relies on a decomposition of the problem into single-example subproblems and conditional gradient ascent for optimisation of these subproblems.…”
Section: Related Work
confidence: 99%