2014
DOI: 10.1609/aaai.v28i1.8981

Supervised Transfer Sparse Coding

Abstract: A combination of the sparse coding and transfer learning techniques was shown to be accurate and robust in classification tasks where training and testing objects share a feature space but are sampled from different underlying distributions, i.e., belong to different domains. The key assumption in this setting is that, despite the domain disparity, samples from different domains share some common hidden factors. Previous methods often assumed that all the objects in the target domain are unlabeled, and t…
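To make the recipe in the abstract concrete, here is a minimal sketch of the shared-hidden-factors idea, assuming scikit-learn's DictionaryLearning and a toy synthetic dataset; it is not the paper's STSC algorithm, which additionally builds label information into the coding objective. A single dictionary is learned on pooled source and target samples, both domains are sparse-coded against it, and a classifier is trained on the labeled source codes.

```python
# Minimal sketch of transfer via a shared sparse-coding dictionary.
# NOT the STSC algorithm from the paper: STSC also exploits labels and
# distribution-matching terms in its objective. All parameter values
# below (n_components, alpha, data shapes) are illustrative assumptions.
import numpy as np
from sklearn.decomposition import DictionaryLearning, sparse_encode
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Toy data: source and target share the feature space (dim 20)
# but are drawn from shifted distributions.
X_src = rng.normal(0.0, 1.0, size=(200, 20))
y_src = (X_src[:, 0] > 0).astype(int)          # labels known on the source
X_tgt = rng.normal(0.5, 1.2, size=(150, 20))   # unlabeled target domain

# 1) Learn one dictionary on pooled data so that both domains are
#    expressed through the same hidden factors (dictionary atoms).
dico = DictionaryLearning(n_components=30, alpha=1.0, max_iter=50,
                          random_state=0)
dico.fit(np.vstack([X_src, X_tgt]))
D = dico.components_                            # shape (30, 20)

# 2) Sparse-code each domain against the shared dictionary.
Z_src = sparse_encode(X_src, D, alpha=1.0)
Z_tgt = sparse_encode(X_tgt, D, alpha=1.0)

# 3) Train on labeled source codes; predict on target codes.
clf = LogisticRegression(max_iter=1000).fit(Z_src, y_src)
print("predicted target labels:", clf.predict(Z_tgt)[:10])
```

The pooled dictionary is what encodes the shared-hidden-factors assumption: both domains are forced to be reconstructed from the same set of atoms, so their sparse codes live in a common representation.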

Cited by 17 publications (10 citation statements)
References 30 publications
“…Transferring knowledge from the source gives an acceptable idea of the target without needing instances from the target. This method builds on the assumption that even a few target samples are important in training [10,37]. The result obtained without adding target data to the training is 30%.…”
Section: STSC Methods Results
confidence: 99%
“…To discuss the proposed method, we split it into two parts: sparse coding and domain transfer. The algorithm was inspired by [37].…”
Section: Supervised Transfer Sparse Coding Methods (STSC)
confidence: 99%
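The excerpt above splits the approach into a sparse-coding stage and a domain-transfer stage. As an illustration of how the transfer side can be quantified, the sketch below computes a squared maximum mean discrepancy (MMD) between source and target sparse codes with an RBF kernel; using MMD here is an assumption for illustration, not a claim about the exact transfer term in STSC or in [37].

```python
# Illustrative domain-transfer diagnostic: squared RBF-kernel MMD between
# the sparse codes of the two domains. A smaller value means the shared
# dictionary has pulled the domains closer together in code space.
# Assumption: MMD stands in for whatever distribution-matching term a
# given transfer-sparse-coding objective actually uses.
import numpy as np

def rbf_mmd2(Zs: np.ndarray, Zt: np.ndarray, gamma: float = 0.5) -> float:
    """Squared MMD with RBF kernel k(a, b) = exp(-gamma * ||a - b||^2)."""
    def kernel_mean(A, B):
        # Pairwise squared distances via ||a-b||^2 = ||a||^2 + ||b||^2 - 2a.b
        sq = (np.sum(A**2, axis=1)[:, None]
              + np.sum(B**2, axis=1)[None, :]
              - 2.0 * A @ B.T)
        return np.exp(-gamma * sq).mean()
    return kernel_mean(Zs, Zs) + kernel_mean(Zt, Zt) - 2.0 * kernel_mean(Zs, Zt)

# Example with random stand-ins for sparse codes:
rng = np.random.default_rng(0)
Z_src = rng.normal(0.0, 1.0, size=(200, 30))
Z_tgt = rng.normal(0.3, 1.0, size=(150, 30))
print(f"squared MMD between domains: {rbf_mmd2(Z_src, Z_tgt):.4f}")
```

In a transfer-aware objective, a term like this would be minimized jointly with the reconstruction and sparsity penalties so that the learned dictionary draws the two domains together in code space.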
“…Recently, a number of approaches have studied how to conduct semi-supervised learning using non-negative matrix factorization (Koren 2008) (Singh and Gordon 2008) (Jiang et al. 2014) (Al-Shedivat et al. 2014). For multi-label learning, Liu et al. proposed constrained NMF to minimize the difference between input patterns and class memberships (Liu, Jin, and Yang 2006).…”
Section: Semi-supervised NMF
confidence: 99%
“…Following this line, many variants of KSVD have appeared, such as DKSVD (Zhang and Li 2010) and LCKSVD (Jiang, Lin, and Davis 2011). More recently, several methods have combined sparse coding with transfer learning (Huang et al. 2013; Al-Shedivat et al. 2014), which makes the learned dictionary more suitable for the testing data. To learn a more compact dictionary, group sparse representation was proposed in (Bengio et al. 2009; Gao et al. 2010; Chi et al. 2013).…”
Section: Related Work
confidence: 99%