2022
DOI: 10.1088/2632-2153/ac9885
Twin neural network regression is a semi-supervised regression algorithm

Abstract: Twin neural network regression (TNNR) is trained to predict differences between the target values of two different data points rather than the targets themselves. By ensembling predicted differences between the targets of an unseen data point and all training data points, it is possible to obtain a very accurate prediction for the original regression problem. Since any loop of predicted differences should sum to zero, loops can be supplied to the training data, even if the data points themselves within loops a…
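The inference-time ensembling and the loop-consistency property described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: an idealized linear difference predictor stands in for the trained twin network, and all variable names are assumptions for the example.

```python
import numpy as np

# Hypothetical stand-in for a trained twin network. A real TNNR model is a
# neural network that takes the pair (x1, x2) and outputs y(x1) - y(x2);
# here we use an exact linear model so the sketch is self-contained.
rng = np.random.default_rng(0)
w = np.array([2.0, -1.0])

def true_target(x):
    return x @ w

def predict_difference(x1, x2):
    # TNNR predicts the difference of targets, not the targets themselves.
    return true_target(x1) - true_target(x2)

# Labeled training set
X_train = rng.normal(size=(50, 2))
y_train = true_target(X_train)

def tnnr_predict(x_new):
    # Ensemble over all training anchors:
    #   y(x_new) ~ mean_i [ f(x_new, x_i) + y_i ]
    diffs = np.array([predict_difference(x_new, xi) for xi in X_train])
    return np.mean(diffs + y_train)

x_new = rng.normal(size=2)
est = tnnr_predict(x_new)

# Loop consistency: predicted differences around any closed loop of points
# should sum to zero. This constraint holds even for unlabeled points, which
# is what makes the semi-supervised use of unlabeled data possible.
a, b, c = X_train[0], X_train[1], X_train[2]
loop = (predict_difference(a, b)
        + predict_difference(b, c)
        + predict_difference(c, a))
```

With the idealized predictor the ensemble recovers the target exactly and the loop sum vanishes; with a trained network both hold only approximately, and deviations of `loop` from zero can be penalized during training.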

Cited by 4 publications (3 citation statements) · References 15 publications
“…Semi-supervised regression:
- COREG [22]: uses two models to generate pseudo-labels from unlabeled data
- SSDKL [33]: combines neural networks and kernel methods to minimize prediction variance on unlabeled data
- TNNR [23]: trains on the difference between predictions of two unlabeled data pairs through loop consistency
- UCVME [21]: improves the quality of uncertainty estimates on unlabeled data using a…”
Section: Category, Methods, Description (citation type: mentioning)
Confidence: 99%
“…However, COREG comes with increased computational costs because of the separate training of the two K-nearest-neighbors regression models, and reduced efficiency caused by the information transfer between the models. Wetzel et al. (2022) [23] proposed twin neural network regression (TNNR), which trains on the difference between the predictions of two unlabeled data pairs through loop consistency. Although TNNR enables learning from unlabeled data through loop consistency, it requires two or more independent models, increasing computational costs.…”
Section: Related Work — Semi-supervised Learning Methods (citation type: mentioning)
Confidence: 99%