2014
DOI: 10.1007/978-3-319-13969-2_19

Interactive Augmented Omnidirectional Video with Realistic Lighting

Abstract: This paper presents the augmentation of immersive omnidirectional video with realistically lit objects. Recent years have seen a proliferation of real-time capture and rendering methods for omnidirectional video. Together with these technologies, rendering devices such as the Oculus Rift have increased the immersive experience of users. We demonstrate the use of structure from motion on omnidirectional video to reconstruct the trajectory of the camera. The position of the car is then linked to an appropriate 360…
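The abstract describes two coupled steps: recovering the camera trajectory with structure from motion, and linking the current car position to the environment map captured nearest to it for lighting. A minimal sketch of that lookup step, assuming hypothetical data structures (trajectory, env_maps) rather than the paper's actual ones:

```python
import numpy as np

def nearest_environment_map(position, trajectory, env_maps):
    """Return the captured 360-degree environment map whose camera
    position (from the structure-from-motion trajectory) lies closest
    to the queried world-space position.

    position   -- (3,) array, current position of the virtual car/object
    trajectory -- (N, 3) array of reconstructed camera positions
    env_maps   -- list of N environment maps (e.g. equirectangular images)
    """
    distances = np.linalg.norm(trajectory - position, axis=1)
    return env_maps[int(np.argmin(distances))]
```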

Cited by 9 publications (7 citation statements). References: 28 publications.
“…Most of them use a method of applying camera tracking to perspective views of a 360° video clip before the stitching process. One such method, proposed by Michiels et al., uses a perspective view from one of the cameras in the 360° rig to obtain an undistorted image and eliminate stitching errors [13]. In addition, Huang et al. proposed a method for obtaining stable tracking results, which corrects the image by overlapping the point where distortion occurs with the position difference between frames [14].…”
Section: Stereo Vision (mentioning, confidence: 99%)
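The projection step these statements refer to, i.e. sampling an undistorted pinhole view out of an equirectangular 360° frame so that conventional camera tracking can run on it, can be sketched as follows (function and parameter names are illustrative, not taken from [13]):

```python
import numpy as np

def perspective_from_equirectangular(equi, fov_deg=90.0, yaw=0.0, pitch=0.0,
                                     out_w=640, out_h=480):
    """Sample an undistorted pinhole view from an equirectangular frame.

    equi  -- (H, W, C) equirectangular image
    yaw   -- rotation around the vertical axis, radians
    pitch -- rotation around the horizontal axis, radians
    """
    H, W = equi.shape[:2]
    f = 0.5 * out_w / np.tan(0.5 * np.radians(fov_deg))

    # Pixel grid of the virtual pinhole camera, as unit rays in camera space.
    x, y = np.meshgrid(np.arange(out_w) - 0.5 * out_w,
                       np.arange(out_h) - 0.5 * out_h)
    dirs = np.stack([x, y, np.full_like(x, f)], axis=-1)
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)

    # Orient the virtual camera: yaw around the vertical axis, pitch around x.
    cy, sy, cp, sp = np.cos(yaw), np.sin(yaw), np.cos(pitch), np.sin(pitch)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    dirs = dirs @ (Ry @ Rx).T

    # Convert ray directions to longitude/latitude, then to pixel coordinates.
    lon = np.arctan2(dirs[..., 0], dirs[..., 2])        # [-pi, pi]
    lat = np.arcsin(np.clip(dirs[..., 1], -1.0, 1.0))   # [-pi/2, pi/2]
    u = ((lon / np.pi + 1.0) * 0.5 * (W - 1)).astype(int)
    v = ((lat / (0.5 * np.pi) + 1.0) * 0.5 * (H - 1)).astype(int)
    return equi[v, u]
```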
“…Iorns et al. [22] developed a system that uses live-streamed 360° video as input for image-based lighting, investigating real-time shadowing and reflection. Michiels et al. [23] linked an appropriate 360° environment map to the car position to add real-time lighting through the rendering equation.…”
Section: Related Work (mentioning, confidence: 99%)
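For context, the image-based-lighting form of the rendering equation that such work builds on, with the captured 360° environment map acting as the distant light source (a standard formulation, not reproduced from [22] or [23]):

```latex
% Outgoing radiance at surface point x in direction w_o, lit only by the
% captured 360-degree environment map L_env.
L_o(\mathbf{x}, \omega_o) \;=\;
  \int_{\Omega} f_r(\mathbf{x}, \omega_i, \omega_o)\,
  L_{\mathrm{env}}(\omega_i)\,
  (\mathbf{n} \cdot \omega_i)\; \mathrm{d}\omega_i
```

Here f_r is the surface BRDF and L_env is sampled from whichever environment map is associated with the current camera or car position.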
“…Recently, Michiels et al. [34] demonstrated the use of captured 360° video to render virtual objects into a reconstructed scene. Their panoramic video-capture hardware is fixed to a vehicle that is driven through a real-life environment.…”
Section: Real-Time IBL Using Omnidirectional Video (mentioning, confidence: 99%)
“…There are however still several major drawbacks to this technique. The SRBF…”
[Figure 3.3: Results from [34]. Lack of HDR environment maps leads to inaccurate and thus somewhat unconvincing lighting, as can be seen with the lower right sphere appearing darker than the surroundings, and the upper right car appearing more blue than would be expected from looking at the environment.]
Section: Real-Time IBL Using Omnidirectional Video (mentioning, confidence: 99%)
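The darker-than-expected sphere noted in this excerpt is the usual symptom of lighting from a low-dynamic-range capture: once bright sources such as the sun are clamped to the display range, the integrated irradiance is underestimated. A toy numerical illustration of the effect (the radiance values are made up, purely for intuition):

```python
import numpy as np

# Hypothetical scene radiance for a handful of environment directions:
# one very bright sun sample and dimmer sky/ground samples (arbitrary units).
hdr_radiance = np.array([50.0, 1.2, 0.9, 0.6, 0.3])

# An LDR capture clamps everything above the displayable maximum (here 1.0).
ldr_radiance = np.clip(hdr_radiance, 0.0, 1.0)

# Diffuse IBL integrates radiance over the hemisphere; with equal solid
# angles and cosine terms the comparison reduces to a simple mean.
print("HDR irradiance estimate:", hdr_radiance.mean())  # ~10.6
print("LDR irradiance estimate:", ldr_radiance.mean())  # ~0.8

# The clamped map loses the sun's energy, so virtual objects lit with it
# come out darker than their real surroundings, as observed in [34].
```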