2022
DOI: 10.1007/978-3-031-19821-2_6
Semantic-Aware Fine-Grained Correspondence

Cited by 9 publications (1 citation statement)
References 55 publications
“…Motivated by the success of multi-task learning in various tasks [63][64][65][66], we derive the overall training loss of MMCL as below:…”
Section: Self-distillation Layer
confidence: 99%