2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr52688.2022.01877
Contrastive Regression for Domain Adaptation on Gaze Estimation

Cited by 44 publications (25 citation statements)
References 25 publications
“…Another work by Wang et al. (2022) proposes to improve domain adaptation for gaze estimation by adding a contrastive loss term to the L1 loss. They show that their approach is beneficial for adapting a gaze estimation model from one dataset (i.e., one domain) to another, but the approach produces no benefit and even reduces performance on the source dataset.…”
Section: Related Work
confidence: 99%
“…The most notable version of contrastive loss is NT-Xent, highlighted by Chen et al. [9]. Previously, contrastive loss was used in the context of classification tasks and is thus not well suited to regression tasks [10]. Wang et al. [10] proposed a contrastive loss that uses the distribution of the labels as weights via KL divergence.…”
Section: Contrastive Learning
confidence: 99%
“…Previously, contrastive loss was used in the context of classification tasks and is thus not well suited to regression tasks [10]. Wang et al. [10] proposed a contrastive loss that uses the distribution of the labels as weights via KL divergence. Jindal et al. [20] recently proposed GazeCRL, a contrastive learning framework for gaze estimation.…”
Section: Contrastive Learning
confidence: 99%
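The label-weighted contrastive loss the statements above describe can be sketched roughly as follows. This is a minimal NumPy illustration of the general idea (an NT-Xent-style loss whose positive weights decay with the KL divergence between Gaussians centered at each pair of regression labels), not the implementation of Wang et al. [10]: the function names, the isotropic-Gaussian label model, and the `sigma`/`tau` parameters are all assumptions made here for illustration.

```python
import numpy as np

def kl_gaussian(mu1, mu2, sigma=0.1):
    # KL divergence between two isotropic Gaussians with shared variance
    # reduces to the squared distance between their means over 2*sigma^2.
    return np.sum((mu1 - mu2) ** 2) / (2 * sigma ** 2)

def weighted_contrastive_loss(features, labels, tau=0.1, sigma=0.1):
    """Sketch of a label-weighted contrastive loss for regression.

    features: (N, D) L2-normalized embeddings
    labels:   (N, 2) regression targets, e.g. gaze (yaw, pitch)

    Instead of hard positive/negative pairs (as in classification),
    each pair (i, j) gets a soft positive weight exp(-KL) that is
    near 1 for similar labels and near 0 for dissimilar ones.
    """
    n = len(features)
    sim = features @ features.T / tau  # temperature-scaled cosine similarity
    loss = 0.0
    for i in range(n):
        # Soft positive weights from label similarity (self excluded).
        w = np.array([np.exp(-kl_gaussian(labels[i], labels[j], sigma))
                      if j != i else 0.0 for j in range(n)])
        logits = np.exp(sim[i])
        logits[i] = 0.0                # exclude self-similarity
        p = logits / logits.sum()      # softmax over the other samples
        # Weighted cross-entropy: pull label-similar pairs together.
        loss += -np.sum(w * np.log(p + 1e-12)) / max(w.sum(), 1e-12)
    return loss / n
```

Under this sketch, a batch whose embeddings align with label similarity yields a lower loss than a randomly embedded batch, which is the behavior the contrastive regression objective is after.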