2022
DOI: 10.1109/lra.2022.3196781
TransDSSL: Transformer Based Depth Estimation via Self-Supervised Learning

Cited by 20 publications (8 citation statements). References 28 publications.
“…Chang et al. [18] proposed an attention-based up-sample block to compensate for the texture features and an attention supervision mechanism to provide further guidance for transformer layers. Han et al. [19] proposed an attention-based decoder module to enhance fine details while keeping the global context from the vision transformer.…”
Section: Vision Transformer for Depth Estimation
Mentioning confidence: 99%
“…2 showcases the technical pipeline of our Diffusion-Augmented Depth Prediction (DADP) framework. On autonomous driving scenarios with only sparse annotations [6,16], the key problem is to enforce regional and holistic spatial structures without the unreliable pose estimation of prior arts [14,19,20,51,55]. DADP contains two main components: a depth predictor and a noise predictor.…”
Section: Proposed Method 3.1 Overview
Mentioning confidence: 99%
“…Self-supervised Depth Estimation. To enhance depth structures on driving scenes [6,16], prior arts [17,19,20,51,55] seek the self-supervised paradigm. They jointly optimize a pose module and a depth module with a photometric loss [12].…”
Section: Related Work
Mentioning confidence: 99%
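As context for the photometric loss referenced in the statement above, the sketch below shows how self-supervised methods can optimize a depth network and a pose network jointly: the target view is reconstructed by warping a source frame with the predicted depth and relative pose, and the reconstruction error drives both predictions. This is a minimal illustration, not the method of any cited paper; the function name, the plain L1 term, and the pinhole-camera warping are assumptions, and published approaches typically add SSIM, auto-masking, and multi-scale terms.

```python
import torch
import torch.nn.functional as F

def photometric_loss(target, source, depth, pose, K):
    """Minimal photometric (view-reconstruction) loss sketch.

    target, source: (B,3,H,W) adjacent frames
    depth:          (B,1,H,W) predicted depth of the target view
    pose:           (B,4,4) predicted target->source rigid transform
    K:              (B,3,3) camera intrinsics
    """
    B, _, H, W = depth.shape
    dev = depth.device

    # Pixel grid of the target view in homogeneous coordinates, (B,3,H*W).
    ys, xs = torch.meshgrid(torch.arange(H, device=dev),
                            torch.arange(W, device=dev), indexing="ij")
    pix = torch.stack([xs, ys, torch.ones_like(xs)], dim=0)
    pix = pix.float().view(1, 3, -1).expand(B, -1, -1)

    # Back-project pixels to 3D points using the predicted depth.
    cam = torch.linalg.inv(K) @ pix * depth.view(B, 1, -1)

    # Transform the points into the source camera frame and project them.
    cam_h = torch.cat([cam, torch.ones(B, 1, H * W, device=dev)], dim=1)
    src = K @ (pose @ cam_h)[:, :3, :]
    uv = src[:, :2, :] / src[:, 2:3, :].clamp(min=1e-6)

    # Normalize to [-1, 1] and synthesize the target view from the source.
    u = 2.0 * uv[:, 0, :] / (W - 1) - 1.0
    v = 2.0 * uv[:, 1, :] / (H - 1) - 1.0
    grid = torch.stack([u, v], dim=-1).view(B, H, W, 2)
    warped = F.grid_sample(source, grid, padding_mode="border",
                           align_corners=True)

    # L1 photometric error; gradients flow into both depth and pose networks.
    return (target - warped).abs().mean()
```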