Ecological remote sensing is being transformed by three-dimensional (3D), multispectral measurements of forest canopies made by unmanned aerial vehicles (UAVs) and computer vision structure-from-motion (SFM) algorithms. Yet applications of this technology have outpaced understanding of the relationship between collection method and data quality. Here, UAV-SFM remote sensing was used to produce 3D multispectral point clouds of temperate deciduous forests at different levels of UAV altitude, image overlap, weather, and image processing. Error in canopy height estimates was explained by the alignment of the canopy height model to the digital terrain model (R² = 0.81), which varied with differences in lighting and image overlap. Accounting for this, no significant differences were observed in height error across levels of lighting, altitude, and side overlap. Overall, accurate estimates of canopy height compared to field measurements (R² = 0.86, RMSE = 3.6 m) and LIDAR (R² = 0.99, RMSE = 3.0 m) were obtained under optimal conditions of clear lighting and high image overlap (>80%). Variation in point cloud quality appeared related to the behavior of SFM 'image features'. Future research should consider the role of image features as the fundamental unit of SFM remote sensing, akin to the pixel of optical imaging and the laser pulse of LIDAR.
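To make the height-accuracy comparison concrete, the following minimal sketch (with synthetic values, not the study's data) derives a canopy height model by differencing a surface model and a terrain model, then scores it against hypothetical field measurements with RMSE and R². The array names and values are assumptions for illustration only.

```python
# Minimal sketch (synthetic values, not the study's data): derive a canopy
# height model (CHM) by differencing a digital surface model and a digital
# terrain model, then score it against field-measured heights with RMSE and R^2.
import numpy as np

def canopy_height(dsm: np.ndarray, dtm: np.ndarray) -> np.ndarray:
    """Canopy height = surface elevation minus terrain elevation."""
    return dsm - dtm

def accuracy(predicted: np.ndarray, observed: np.ndarray) -> tuple[float, float]:
    """Return (RMSE, R^2) of predicted canopy heights against observations."""
    resid = predicted - observed
    rmse = float(np.sqrt(np.mean(resid ** 2)))
    ss_res = float(np.sum(resid ** 2))
    ss_tot = float(np.sum((observed - observed.mean()) ** 2))
    return rmse, 1.0 - ss_res / ss_tot

# Hypothetical elevations (m) for three plots, purely illustrative.
dsm = np.array([25.1, 30.4, 18.2])    # SFM-derived surface elevations
dtm = np.array([2.0, 1.5, 0.8])       # terrain elevations
field = np.array([22.5, 28.0, 17.9])  # field-measured canopy heights
print(accuracy(canopy_height(dsm, dtm), field))
```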
We present a new algorithm for appearance-preserving simplification. Not only does it generate a low-polygon-count approximation of a model, but it also preserves the appearance of that model. This is accomplished for a particular display resolution in the sense that we properly sample the surface position, curvature, and color attributes of the input surface. We convert the input surface to a representation that decouples the sampling of these three attributes, storing the colors and normals in texture and normal maps, respectively. Our simplification algorithm employs a new texture deviation metric, which guarantees that these maps shift by no more than a user-specified number of pixels on the screen. The simplification process filters the surface position, while the runtime system filters the colors and normals on a per-pixel basis. We have applied our simplification technique to several large models, achieving significant amounts of simplification with little or no loss in rendering quality.
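A hedged sketch of a screen-space deviation test in the spirit of the texture deviation metric described above: a simplification step is accepted only if corresponding points on the original and simplified surfaces project to screen positions within a pixel budget. The `project` helper, the precomputed point correspondences, and the single-viewpoint check are simplifying assumptions, not the paper's actual metric.

```python
# Hedged sketch of a screen-space texture deviation check: accept a
# simplification step only if corresponding points on the original and
# simplified surfaces project to screen positions at most `max_pixels` apart.
# The projection helper and the precomputed correspondences are assumptions.
import numpy as np

def project(points: np.ndarray, mvp: np.ndarray, viewport: tuple[int, int]) -> np.ndarray:
    """Project world-space points to pixel coordinates with a 4x4 MVP matrix."""
    homog = np.hstack([points, np.ones((points.shape[0], 1))]) @ mvp.T
    ndc = homog[:, :2] / homog[:, 3:4]
    return (ndc * 0.5 + 0.5) * np.array(viewport, dtype=float)

def within_deviation(orig_pts: np.ndarray, simp_pts: np.ndarray,
                     mvp: np.ndarray, viewport: tuple[int, int],
                     max_pixels: float = 1.0) -> bool:
    """True if every corresponding point pair deviates by <= max_pixels on screen."""
    d = np.linalg.norm(project(orig_pts, mvp, viewport)
                       - project(simp_pts, mvp, viewport), axis=1)
    return bool(np.all(d <= max_pixels))
```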
We present Glimmer, a new multilevel algorithm for multidimensional scaling designed to exploit modern graphics processing unit (GPU) hardware. We also present GPU-SF, a parallel, force-based subsystem used by Glimmer. Glimmer organizes input into a hierarchy of levels and recursively applies GPU-SF to combine and refine the levels. The multilevel nature of the algorithm makes local minima less likely, while the GPU parallelism improves the speed of computation. We propose a robust termination condition for GPU-SF based on a filtered approximation of the normalized stress function. We demonstrate the benefits of Glimmer in terms of speed, normalized stress, and visual quality against several previous algorithms for a range of synthetic and real benchmark datasets. We also show that Glimmer on the GPU is substantially faster than a CPU implementation of the same algorithm.
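The sketch below illustrates the kind of filtered stopping rule described above: normalized stress is computed from high-dimensional and layout-space distances, smoothed with a simple moving average, and iteration stops once the smoothed value levels off. The window size, tolerance, and choice of a moving-average filter are assumptions, not Glimmer's exact termination condition.

```python
# Sketch of a filtered termination test for a force-directed MDS layout:
# normalized stress is tracked per iteration, smoothed with a moving average,
# and iteration stops once the smoothed value stops improving. Window size and
# tolerance are assumptions, not Glimmer's exact filter.
import numpy as np

def normalized_stress(hi_dist: np.ndarray, lo_dist: np.ndarray) -> float:
    """Normalized stress between high-dimensional and layout-space distances."""
    return float(np.sum((hi_dist - lo_dist) ** 2) / np.sum(hi_dist ** 2))

def should_stop(stress_history: list[float], window: int = 5, eps: float = 1e-4) -> bool:
    """Stop when the moving-average stress changes by less than eps (relative)."""
    if len(stress_history) < 2 * window:
        return False
    recent = float(np.mean(stress_history[-window:]))
    previous = float(np.mean(stress_history[-2 * window:-window]))
    return abs(previous - recent) < eps * previous
```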
High-quality, physically accurate rendering at interactive rates has widespread application but is a daunting task. We attempt to bridge the gap between high-quality offline and interactive rendering by using existing environment mapping hardware in combination with a novel Image Based Rendering (IBR) algorithm. The primary contribution lies in performing IBR in reflection space. This method can be applied to ordinary environment maps, but for more physically accurate rendering, we apply reflection-space IBR to radiance environment maps. A radiance environment map pre-integrates a Bidirectional Reflection Distribution Function (BRDF) with a lighting environment. Using the reflection-space IBR algorithm on radiance environment maps allows interactive rendering of arbitrary objects with a large class of complex BRDFs in arbitrary lighting environments. The ultimate simplicity of the final algorithm suggests that it will be widely and immediately valuable given the ready availability of hardware-assisted environment mapping.
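To make the idea of a radiance environment map concrete, the sketch below pre-integrates a sampled lighting environment against a simple cosine-power lobe so that each reflection direction stores BRDF-weighted incoming radiance. The lobe shape, normalization, and direction sampling are illustrative assumptions; the paper supports a much broader class of BRDFs.

```python
# Illustrative pre-integration of a sampled lighting environment with a simple
# cosine-power lobe: each output (reflection) direction stores the lobe-weighted
# average of incoming radiance. The lobe, normalization, and direction sampling
# are assumptions; the paper handles a much broader class of BRDFs.
import numpy as np

def preintegrate(env_dirs: np.ndarray, env_radiance: np.ndarray,
                 out_dirs: np.ndarray, shininess: float = 32.0) -> np.ndarray:
    """env_dirs: (N,3) unit directions; env_radiance: (N,3) RGB; out_dirs: (M,3)."""
    out = np.zeros((out_dirs.shape[0], env_radiance.shape[1]))
    for i, r in enumerate(out_dirs):
        weights = np.clip(env_dirs @ r, 0.0, None) ** shininess  # lobe weights
        out[i] = (weights[:, None] * env_radiance).sum(axis=0) / max(weights.sum(), 1e-8)
    return out
```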
Programmable shading is a common technique for production animation, but interactive programmable shading is not yet widely available. We support interactive programmable shading on virtually any 3D graphics hardware using a scene graph library on top of OpenGL. We treat the OpenGL architecture as a general SIMD computer, and translate the high-level shading description into OpenGL rendering passes. While our system uses OpenGL, the techniques described are applicable to any retained-mode interface with appropriate extension mechanisms and hardware API with provisions for recirculating data through the graphics pipeline. We present two demonstrations of the method. The first is a constrained shading language that runs on graphics hardware supporting OpenGL 1.2 with a subset of the ARB imaging extensions. We remove the shading language constraints by minimally extending OpenGL. The key extensions are color range (supporting extended range and precision data types) and pixel texture (using framebuffer values as indices into texture maps). Our second demonstration is a renderer supporting the RenderMan Interface and RenderMan Shading Language on a software implementation of this extended OpenGL. For both languages, our compiler technology can take advantage of extensions and performance characteristics unique to any particular graphics hardware.
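The following toy sketch illustrates the "graphics pipeline as a SIMD computer" idea in spirit only: a small shading expression tree is flattened into a sequence of elementwise passes over a framebuffer-sized array, analogous to recirculating framebuffer data through multiple rendering passes. The expression encoding and operators are assumptions, not the authors' compiler or shading language.

```python
# Toy illustration of the "graphics pipeline as SIMD computer" idea: a shading
# expression tree is flattened into a sequence of elementwise passes over a
# framebuffer-sized array, mirroring multi-pass rendering that recirculates
# framebuffer data. The expression encoding is an assumption, not the authors'
# compiler or shading language.
import numpy as np

def evaluate(expr, passes: list):
    """Recursively evaluate ('add'|'mul', lhs, rhs) trees, one array-wide pass per node."""
    if isinstance(expr, np.ndarray):              # leaf: a texture or constant image
        return expr
    op, lhs, rhs = expr
    a, b = evaluate(lhs, passes), evaluate(rhs, passes)
    passes.append(op)                             # record one framebuffer-wide pass
    return a + b if op == "add" else a * b

# diffuse * lightmap + ambient over a 4x4 RGB "framebuffer" (random stand-ins).
diffuse, lightmap, ambient = (np.random.rand(4, 4, 3) for _ in range(3))
passes: list = []
result = evaluate(("add", ("mul", diffuse, lightmap), ambient), passes)
print(passes, result.shape)  # ['mul', 'add'] (4, 4, 3)
```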