2024
DOI: 10.1109/tnnls.2022.3203315

LaplaceNet: A Hybrid Graph-Energy Neural Network for Deep Semisupervised Classification

Cited by 11 publications (3 citation statements)
References 26 publications
“…In the second category, there are semi-supervised approaches, such as MixMatch [1], FixMatch [56] and FlexMatch [77], and others [1,41,48,52,69] that use pseudo-labeling or self-labeling, where unlabelled data is assigned pseudo-labels. Generative, teacher-student, or ensemble models can also be listed among the semi-supervised approaches [12,16,23,33,38,49,53,55,59,64,65].…”
Section: Introduction
confidence: 99%
“…Another body of work [52][53][54][55], similar to the works mentioned above, focuses on producing artificial labels, also known as pseudo-labels, which eventually become accurate labelled data. Pseudo-labeling is one method of implicitly minimizing the class-overlap entropy, thereby achieving low-density separation [23,46].…”
Section: Semi-supervised Learning
confidence: 99%
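The pseudo-labeling idea referenced in this statement can be illustrated with a minimal sketch, assuming a FixMatch-style confidence threshold; the function name, threshold value, and example probabilities below are illustrative and not taken from any of the cited papers.

```python
# Minimal sketch of confidence-thresholded pseudo-labeling (hypothetical;
# names and threshold are illustrative, not from the cited works).
import numpy as np

def pseudo_label(probs: np.ndarray, threshold: float = 0.95):
    """Assign a hard pseudo-label to each unlabelled example whose top
    predicted probability exceeds `threshold`; the rest are skipped."""
    conf = probs.max(axis=1)        # model confidence per example
    labels = probs.argmax(axis=1)   # tentative hard labels
    mask = conf >= threshold        # keep only confident predictions
    return labels[mask], mask

# Three unlabelled examples; only the two confident ones get pseudo-labels.
probs = np.array([[0.97, 0.03],
                  [0.60, 0.40],
                  [0.02, 0.98]])
labels, mask = pseudo_label(probs)
print(labels.tolist(), mask.tolist())  # → [0, 1] [True, False, True]
```

Keeping only high-confidence predictions is what pushes the decision boundary into low-density regions of the input space, as the statement notes.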
“…Typically, their architectures involve a teacher that generates pseudo-labels and a student that learns from those pseudo-labels iteratively. However, other work [55] has demonstrated the effectiveness of self-training (which does not involve a teacher), data distillation [57], and graph-based propagation [24] in generating pseudo-labels using only data augmentation. According to [56], distillation transfers knowledge from a cumbersome model (the teacher) to a distilled model (the student) using the soft targets (pseudo-labels) of a transfer set.…”
Section: Semi-supervised Learning
confidence: 99%
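The soft-target transfer described in this statement can be sketched as a temperature-softened cross-entropy between teacher and student predictions; the function names, temperature value, and example logits below are illustrative assumptions, not the exact formulation of [56].

```python
# Minimal sketch of soft-target distillation: the teacher's
# temperature-softened probabilities supervise the student (illustrative).
import numpy as np

def softmax(logits: np.ndarray, T: float = 1.0) -> np.ndarray:
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T: float = 2.0):
    """Cross-entropy between the teacher's soft targets and the student's
    predictions, both softened by temperature T."""
    p_teacher = softmax(teacher_logits, T)  # soft targets (pseudo-labels)
    p_student = softmax(student_logits, T)
    return float(-(p_teacher * np.log(p_student + 1e-12)).sum(axis=-1).mean())

teacher = np.array([[4.0, 1.0, 0.5]])
student = np.array([[3.0, 1.5, 0.2]])
loss = distillation_loss(student, teacher)
```

A higher temperature flattens the teacher's distribution, exposing the relative similarities among non-target classes that a hard label would discard.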