2020
DOI: 10.1109/tfuzz.2019.2904237
Self-Supervised Learning for Specified Latent Representation

Abstract: Latent representations produced by current unsupervised learning methods carry no semantic meaning, so they are difficult to relate directly to physical tasks in the real world. To this end, this paper proposes a specified latent representation with physical semantic meaning. First, a few labeled samples are used to generate the framework of the latent space, and these labeled samples are mapped to framework nodes in the latent space. Second, a self-learning method using structured unlabeled samples is p…
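The abstract's first two steps can be illustrated with a minimal sketch. This is not the paper's algorithm, only a hedged toy: a few labeled samples act as "framework nodes" anchoring the latent space, and unlabeled samples receive pseudo-labels by nearest-node assignment, which stands in for the self-learning step. All data, node positions, and labels below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical framework: two labeled samples define nodes in a 2-D latent
# space, each carrying a physical semantic label.
framework_nodes = np.array([[0.0, 0.0], [5.0, 5.0]])
node_labels = np.array([0, 1])

# Hypothetical structured unlabeled samples, clustered near the two nodes.
unlabeled = rng.normal(loc=[[0.0, 0.0]] * 5 + [[5.0, 5.0]] * 5, scale=0.5)

# Self-labeling stand-in: assign each unlabeled sample the semantic label
# of its nearest framework node.
dists = np.linalg.norm(unlabeled[:, None, :] - framework_nodes[None, :, :],
                       axis=2)
pseudo_labels = node_labels[np.argmin(dists, axis=1)]
print(pseudo_labels)
```

The point of the sketch is only that a handful of labeled anchors can propagate semantic meaning to unlabeled data; the paper's actual mapping and self-learning procedure is more involved.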

Cited by 4 publications (1 citation statement)
References 33 publications
“…However, unsupervised learning has no label data at all, which may lead to slow speed and low precision [25]. Self-supervised learning uses the input data to generate supervisory information and benefits almost all types of downstream tasks [26,27]. With Google's successful application of reinforcement learning in the Go game, reinforcement learning has attracted the worldwide attention of researchers.…”
Section: Introduction
confidence: 99%