2014
DOI: 10.1007/978-3-662-44851-9_14

Bi-directional Representation Learning for Multi-label Classification

Abstract: Multi-label classification is a central problem in many application domains. In this paper, we present a novel supervised bi-directional model that learns a low-dimensional mid-level representation for multi-label classification. Unlike traditional multi-label learning methods which identify intermediate representations from either the input space or the output space but not both, the mid-level representation in our model has two complementary parts that capture intrinsic information of the input data and the o…
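
The abstract describes a mid-level representation with two complementary parts, one derived from the input features and one from the label space. The sketch below is only a minimal illustration of that idea under simplifying assumptions (linear SVD projections, ridge decoders, synthetic data); it is not the authors' actual model, and all variable names are illustrative.

```python
# Minimal sketch (not the paper's formulation): a mid-level representation
# with an input-side part and an output-side part, used for multi-label
# prediction. Shapes, the SVD projections, and the ridge solver are assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, d, k, r = 200, 50, 10, 8                    # samples, features, labels, latent dims
X = rng.normal(size=(n, d))                    # input features
Y = (rng.random((n, k)) < 0.2).astype(float)   # binary label matrix

# Input-side part: projection onto the top-r principal directions of X.
_, _, Vt_x = np.linalg.svd(X - X.mean(0), full_matrices=False)
Hx = X @ Vt_x[:r].T                            # (n, r)

# Output-side part: projection onto the top-r principal directions of Y.
_, _, Vt_y = np.linalg.svd(Y - Y.mean(0), full_matrices=False)
Hy = Y @ Vt_y[:r].T                            # (n, r)

# Bi-directional mid-level representation: concatenation of both parts.
H = np.hstack([Hx, Hy])                        # (n, 2r)

# Ridge decoder from the mid-level representation back to the labels.
lam = 1.0
W = np.linalg.solve(H.T @ H + lam * np.eye(2 * r), H.T @ Y)

# At test time only the input-side part is observable, so the output-side
# part is predicted from it (here: another ridge map Hx -> Hy).
A = np.linalg.solve(Hx.T @ Hx + lam * np.eye(r), Hx.T @ Hy)
H_test = np.hstack([Hx, Hx @ A])
Y_scores = H_test @ W                          # threshold to obtain multi-label predictions
```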

Cited by 8 publications (10 citation statements) | References 18 publications
“…A naïve approach to uncertainty-diversity active learning is to use a weighted combination of uncertainty scores and diversity metrics to select samples. [57][58][59] In their work, Rodriguez et al. 59 propose a novel acquisition function for batches of data that combines both uncertainty and diversity weights. Kirsch, Van Amersfoort, and Gal 60 instead use a modified version of uncertainty scoring, taking the mutual information of the whole batch in order to maximize the diversity of samples selected together.…”
Section: Uncertainty-diversity Hybrid Active Learning Methods
Mentioning confidence: 99%
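
The weighted uncertainty-diversity combination described above can be illustrated with a short greedy batch-selection sketch. The scoring rule, the `select_batch` helper, and the `alpha` weight below are assumptions made for illustration; they are not the acquisition functions of Rodriguez et al. or of Kirsch, Van Amersfoort, and Gal.

```python
# Minimal sketch (assumed formulation): score each unlabeled sample by a
# weighted sum of predictive entropy (uncertainty) and distance to the
# already-selected batch (diversity), then greedily build a batch.
import numpy as np

def entropy(probs, eps=1e-12):
    """Predictive entropy of per-class probabilities, shape (n, c)."""
    return -(probs * np.log(probs + eps)).sum(axis=1)

def select_batch(features, probs, batch_size, alpha=0.5):
    """Greedy uncertainty-diversity selection (illustrative, not a cited method).
    features: (n, d) embeddings of the unlabeled pool
    probs:    (n, c) model probabilities
    alpha:    assumed trade-off weight between uncertainty and diversity
    """
    unc = entropy(probs)
    unc = unc / (unc.max() + 1e-12)              # normalize to [0, 1]
    selected = []
    min_dist = np.full(len(features), np.inf)    # distance to nearest selected sample
    for _ in range(batch_size):
        if selected:
            div = min_dist / (min_dist.max() + 1e-12)
        else:
            div = np.ones(len(features))         # first pick: effectively uncertainty only
        score = alpha * unc + (1 - alpha) * div
        score[selected] = -np.inf                # never pick the same sample twice
        i = int(np.argmax(score))
        selected.append(i)
        d = np.linalg.norm(features - features[i], axis=1)
        min_dist = np.minimum(min_dist, d)
    return selected

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    feats = rng.normal(size=(100, 16))
    p = rng.dirichlet(np.ones(5), size=100)
    print(select_batch(feats, p, batch_size=8, alpha=0.5))
```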
“…1) A reconstruction model Ψ_inv : y′ → y is trained during the reduction phase or afterwards [53]. It allows y to be reconstructed from y′ in the test phase.…”
Section: Label Space Reduction (Y)
Mentioning confidence: 99%
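
A minimal sketch of such a label-space reduction with a reconstruction map Ψ_inv is shown below, assuming a PCA-style linear projection of the label matrix (in the spirit of principal label space transformation); reference [53] may use a different reduction, and `psi_inv` is an illustrative name.

```python
# Assumed PCA-style reduction: train on y' = (Y - mean) V, and reconstruct
# label scores at test time via the inverse map Ψ_inv: y' -> y.
import numpy as np

rng = np.random.default_rng(0)
Y = (rng.random((500, 40)) < 0.1).astype(float)   # binary label matrix
m = 10                                            # reduced label dimension

mean = Y.mean(axis=0)
_, _, Vt = np.linalg.svd(Y - mean, full_matrices=False)
V = Vt[:m].T                                      # (40, m) projection matrix

Y_reduced = (Y - mean) @ V                        # targets y' used to train regressors

def psi_inv(y_reduced):
    """Reconstruction model Ψ_inv: map reduced predictions back to label scores."""
    return y_reduced @ V.T + mean                 # threshold (e.g. at 0.5) for labels
```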
“…A major part of the algorithms imposes minimization of the L2-norm of the parameters. This favors solutions with small parameter values [53][29].…”
Mentioning confidence: 99%
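
A small numerical sketch of the remark above: adding an L2 penalty to a least-squares fit shrinks the learned parameters toward zero. The data and regularization strength are illustrative.

```python
# Closed-form ridge regression: w = (X^T X + lam I)^{-1} X^T y.
# Larger lam gives a solution with a smaller parameter norm.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
w_true = np.array([3.0, -2.0, 1.5, 0.0, 4.0])
y = X @ w_true + 0.1 * rng.normal(size=100)

def fit(lam):
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

w_ols = fit(0.0)
w_ridge = fit(100.0)
print(np.linalg.norm(w_ols), np.linalg.norm(w_ridge))  # ridge norm is smaller
```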
“…However, current work on representation learning either neglects label knowledge, suffers from a lack of labeled data, or is limited to linear projections. The most closely related work (Li and Guo 2014) proposed a bi-directional representation model for multi-label classification, in which the mid-level representation layer is constructed from both the input and output spaces. In essence, their network structure is different from ours.…”
Section: Effects On Supervision Information
Mentioning confidence: 99%