2017
DOI: 10.1109/tvcg.2017.2734426
Natural Environment Illumination: Coherent Interactive Augmented Reality for Mobile and Non-Mobile Devices

Abstract: Augmented Reality offers many applications today, especially on mobile devices. Due to the lack of mobile hardware for illumination measurements, photorealistic rendering with consistent appearance of virtual objects is still an area of active research. In this paper, we present a full two-stage pipeline for environment acquisition and augmentation of live camera images using a mobile device with a depth sensor. We show how to directly work on a recorded 3D point cloud of the real environment containing high d…

Cited by 28 publications (35 citation statements)
References 32 publications
“…The 360° contents, which are called 360° photographs or videos, are captured by several wide-angle cameras, whose images are stitched and mapped into a spherical or hemispherical space so the surrounding scene can be observed from the camera's position [2]. Most research on 360° content creation focuses on accurate stitching for spherical mapping [5,6], image-based lighting effects [7][8][9], resolution improvement, and distortion correction of images from wide-angle cameras [10]. Some researchers have investigated methods to improve the interactivity of image-based VR contents by adding URLs to 360° images [11], changing scenes via button clicks [12,13], and adding special effects to reduce the differences between composite boundaries [14].…”
Section: Interactivity in 360° Contents (mentioning)
confidence: 99%
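The spherical mapping mentioned in the citation above can be sketched with a small, hypothetical example: converting a pixel in an equirectangular 360° image into a direction on the unit sphere. The function name and the longitude/latitude conventions here are illustrative assumptions, not taken from the cited works.

```python
import math

def pixel_to_direction(u, v, width, height):
    """Map pixel (u, v) in a width x height equirectangular 360-degree image
    to a unit direction vector (x, y, z) on the sphere.

    Convention (assumed): longitude spans [-pi, pi) left to right,
    latitude spans [pi/2, -pi/2] top to bottom.
    """
    lon = (u / width) * 2.0 * math.pi - math.pi
    lat = math.pi / 2.0 - (v / height) * math.pi
    # Spherical-to-Cartesian conversion; y is "up" in this sketch.
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)

# The image centre maps to the forward direction (0, 0, 1).
forward = pixel_to_direction(512, 256, 1024, 512)
```

The inverse of this mapping is what stitching pipelines use to decide where each wide-angle camera sample lands in the final spherical panorama.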
“…This then provides all the geometric and lighting information required to insert the virtual content. Zhang et al [54] showed how this kind of approach could be used in an MR application for virtually refurnishing rooms, and Rohmer et al [38] used a similar HDR model to render photorealistic virtual content on mobile and desktop hardware. These approaches capture rich scene information using just an RGBD sensor, but have the disadvantage of requiring the whole scene to be reconstructed before virtual content can be added.…”
Section: Real-World Lighting Capture (mentioning)
confidence: 99%
“…Before projecting a new set of surfels corresponding to the current frame, we first convert the observed LDR colour values to HDR values in a linear colour space. We approximately linearise the input values by applying an inverse gamma correction (γ = 2.2), similarly to [38,54]. We then estimate the exposure and white balance applied by the RGBD colour camera.…”
Section: HDR Surfel-Based Reconstruction (mentioning)
confidence: 99%
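The inverse gamma correction described in the citation above (γ = 2.2, as in [38,54]) can be sketched as follows. This is a minimal illustration assuming 8-bit LDR input; the function name and value ranges are assumptions for the example, and the exposure/white-balance estimation mentioned afterwards is not shown.

```python
import numpy as np

GAMMA = 2.2  # display gamma assumed by the approximate linearisation

def linearise_ldr(ldr_rgb):
    """Map 8-bit LDR colour values in [0, 255] to approximately linear
    values in [0, 1] by inverting a gamma-2.2 transfer curve."""
    normalised = np.asarray(ldr_rgb, dtype=np.float64) / 255.0
    return normalised ** GAMMA

# Mid-grey (128, 128, 128) maps to roughly 0.22 in linear space,
# reflecting that perceptually mid-grey is darker than 50% linear intensity.
linear = linearise_ldr(np.array([128, 128, 128]))
```

Working in this approximately linear space is what allows LDR observations taken at different exposures to be accumulated into consistent HDR surfel colours.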