2023
DOI: 10.48550/arxiv.2303.10735
Preprint

SKED: Sketch-guided Text-based 3D Editing

Abstract: We present a sketch-guided, text-based 3D editing method. Given a pretrained Neural Radiance Field, multi-view sketches that delimit the coarse region of the edit, and a text prompt, our method generates a localized, meaningful edit.
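
The abstract only names the inputs and the output; a minimal, hypothetical Python sketch of that interface is given below. All names here (Sketch, EditRequest, edit_nerf) are illustrative placeholders, not the authors' code, and the loop body is only an outline of the general sketch-guided, text-conditioned optimization it describes.

    from dataclasses import dataclass
    from typing import Callable, List, Tuple

    @dataclass
    class Sketch:
        """A 2D sketch drawn over a known camera view (hypothetical structure)."""
        strokes: List[List[Tuple[float, float]]]  # polylines in image coordinates
        camera_pose: List[float]                  # flattened 4x4 camera-to-world matrix

    @dataclass
    class EditRequest:
        sketches: List[Sketch]  # multi-view sketches bounding the coarse edit region
        prompt: str             # text description of the desired edit

    def edit_nerf(pretrained_nerf: Callable, request: EditRequest, steps: int = 1000) -> Callable:
        """Hypothetical driver loop: optimize only the sketched region of the
        pretrained field toward the text prompt, leaving the rest unchanged."""
        edited_nerf = pretrained_nerf  # in practice, a copy with trainable parameters
        for _ in range(steps):
            # 1. render views that cover the region indicated by the sketches
            # 2. apply a text-conditioned score-distillation update for request.prompt
            # 3. regularize outside the sketched region so the original scene is preserved
            pass
        return edited_nerf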

Cited by 1 publication (1 citation statement)
References 51 publications
“…Recently, thanks to the development of the score distillation sampling technique, text-guided editing has emerged as a promising direction with great potential. SKED (Mikaeili et al. 2023) possesses the capability to edit 3D scenes with multi-view sketches. Latent-NeRF (Metzer et al. 2023) and Fantasia3D (Chen et al. 2023) realize sketch-shape guidance by relaxed geometric constraints.…”
Section: Related Work
confidence: 99%
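
For context on the score distillation sampling mentioned in the citation statement: it refers to the objective introduced in DreamFusion (Poole et al. 2022), whose gradient with respect to the 3D parameters \theta is commonly written as

\[
\nabla_\theta \mathcal{L}_{\mathrm{SDS}}(\phi, x = g(\theta)) = \mathbb{E}_{t,\epsilon}\!\left[ w(t)\,\big(\hat{\epsilon}_\phi(x_t; y, t) - \epsilon\big)\,\frac{\partial x}{\partial \theta} \right],
\]

where g renders an image x from the 3D representation, x_t is that image noised to timestep t, \hat{\epsilon}_\phi is the pretrained text-conditioned diffusion model's noise prediction for prompt y, and w(t) is a timestep weighting. This is a general statement of the technique, not the specific formulation used by any one of the cited methods.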