2022
DOI: 10.1007/s11633-022-1362-z
Causal Reasoning Meets Visual Representation Learning: A Prospective Study

Abstract: Visual representation learning is ubiquitous in real-world applications, including visual comprehension, video understanding, multi-modal analysis, human-computer interaction, and urban computing. With the emergence of huge amounts of multimodal heterogeneous spatial/temporal/spatial-temporal data in the big data era, a lack of interpretability, robustness, and out-of-distribution generalization has become a challenge for existing visual models. The majority of the existing methods tend to …


Cited by 27 publications (11 citation statements)
References 216 publications
“…The ability to judge the goodness of received information enables us to select positive information while eliminating negative ones. This concept has been applied to various computer vision tasks [28][29][30].…”
Section: Label Enhancement Methods
confidence: 99%
“…To explain black-box visual classifiers, [56] formulated a causal extension to the paradigm of instance-wise feature selection and obtained the subset of input features with the greatest causal effect on the model's output. [57] conducted a comprehensive review of existing causal reasoning methods for visual representation learning and pointed out the importance of causal reasoning in visual representation learning.…”
Section: B Causal Learning
confidence: 99%
“…Traditional feature learning methods are prone to learning pseudo-correlation properties introduced by confounding factors, which hinders model generalisation across domains [11]. Causal inference can eliminate this pseudo-correlation by replacing the conditional distribution with an intervening distribution [11]. For image tasks, Wang [12] proposed a causal attention module that helps deep models learn robust causal features by annotating contextual confounding factors in an unsupervised manner.…”
Section: Causal Mechanisms
confidence: 99%
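The "intervening distribution" invoked in the excerpt above can be made concrete with Pearl's backdoor adjustment — a standard identity from causal inference, not a formula specific to the cited works. Assuming features $X$, label $Y$, and an observed confounder $C$ that satisfies the backdoor criterion:

```latex
% Interventional distribution: each confounder stratum is
% weighted by its marginal probability P(C = c).
P(Y \mid \mathrm{do}(X)) = \sum_{c} P(Y \mid X, C = c)\, P(C = c)

% Observational conditional: the same strata are weighted by
% P(C = c \mid X), which mixes in the confounder's influence on X
% and produces the pseudo-correlation the excerpt describes.
P(Y \mid X) = \sum_{c} P(Y \mid X, C = c)\, P(C = c \mid X)
```

The only difference between the two expressions is the stratum weight: replacing $P(C = c \mid X)$ with $P(C = c)$ severs the backdoor path $C \rightarrow X$, so correlations that flow only through the confounder no longer contaminate the learned relationship between $X$ and $Y$.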