The determination of image orientation parameters for any sensor during data acquisition has become possible through the combined use of an inertial measurement unit (IMU) and GPS. In such an integrated system, the GPS antenna, the IMU and the imaging sensor are mounted at different positions on the airborne carrier, so the displacement vectors between the sensors have to be determined. Similarly, the axes of the IMU and the imaging sensor do not coincide, and a misorientation matrix exists between them. System calibration comprises both the calibration of the individual sensors and the calibration between sensors. The IMU calibration for drifts and biases and the calibration of the imaging sensor for its interior orientation parameters are the components of individual sensor calibration. Calibration between sensors covers the determination of a constant displacement vector between the sensors and a constant misorientation matrix between the IMU body frame and the imaging sensor frame. The boresight misalignment, i.e. the relation between the IMU and the imaging sensor, is determined by bundle block adjustment using a calibration flight. Small changes in the interior orientation corrections and in the three shifts and three misalignment angles between the IMU and the imaging sensor directly affect direct sensor orientation. In this study, the effect of system calibration on direct sensor orientation is investigated based on the data set of the test 'Integrated Sensor Orientation' of the European Organization for Experimental Photogrammetric Research. To this end, bundle block adjustments have been carried out with different approaches using the calibration flights. From these adjustments, corrections for the interior orientation and the three shifts and three misalignment angles between the IMU and the imaging sensor have been determined. The object coordinates of the measured image points have been intersected based on GPS/IMU data improved by the boresight misalignment. For each approach, the computed checkpoint coordinates have been compared with the given reference coordinates.
The effect of system calibration on direct sensor orientation has been analyzed by comparing georeferencing results obtained with different system calibration parameters.
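The geometry described above, i.e. chaining the GPS antenna position, the lever-arm offset and the boresight matrix into camera position and attitude, can be sketched as follows. All numeric values (lever arm, misalignment angles, positions) are purely illustrative, and the rotation convention is a simplified stand-in for the conventions actually used in the test:

```python
import numpy as np

def rotation_zyx(omega, phi, kappa):
    """Rotation matrix built from three angles (radians).
    A simplified sequential convention chosen for illustration only."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi), np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

# Hypothetical calibration results (not the actual test values):
lever_arm_body = np.array([0.15, -0.05, 1.20])          # camera-to-antenna offset in the IMU body frame (m)
boresight = rotation_zyx(*np.deg2rad([0.05, -0.03, 0.10]))  # three small misalignment angles

# GPS/IMU observations for one exposure (illustrative):
X_gps = np.array([500000.0, 4200000.0, 1500.0])         # antenna position in the mapping frame (m)
R_imu = rotation_zyx(*np.deg2rad([1.0, -0.5, 45.0]))    # IMU attitude, body frame -> mapping frame

# Perspective-centre position: remove the lever arm, rotated into the mapping frame
X_camera = X_gps - R_imu @ lever_arm_body
# Camera attitude: chain the IMU attitude with the constant boresight matrix
R_camera = R_imu @ boresight
```

Because the boresight matrix is constant for the whole block, determining it once from a calibration flight lets every subsequent exposure be oriented directly from the GPS/IMU observations.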
Georeferencing is one of the most important tasks in photogrammetry. Traditionally it has been achieved indirectly using the well-known method of aerial triangulation. With the availability of integrated GPS and inertial measurement units (IMU), this situation changed. Direct determination of exterior orientation is now possible. Today, direct and integrated sensor orientation is used for a wide range of sensors including lidar and SAR, as well as for digital line scanner systems and aerial cameras. This paper investigates the performance of direct and integrated sensor orientation for large scale mapping using the data set of the ''Integrated Sensor Orientation'' test of the European Organisation for Experimental Photogrammetric Research (OEEPE, now known as EuroSDR). The concept, potential, problems and solutions of direct and integrated sensor orientation are discussed.
ABSTRACT: The collection and updating of 3D data is one of the most important steps for GIS applications, which require fast and efficient data collection methods. Photogrammetry has been used for many years as a data collection method for GIS applications over large areas. Unmanned Aerial Vehicle (UAV) systems have gained increasing attention in the geosciences in recent years for cost-effective data capture and updating at high spatial and temporal resolution. These autonomously flying UAV systems are usually equipped with sensors such as GPS receivers, microcomputers, gyroscopes and miniaturized sensor systems for navigation, positioning and mapping purposes. UAV systems can be used to collect data for digital elevation model (DEM) and orthoimage generation in GIS applications over small areas. In this study, data collection and processing with a light UAV system are evaluated for GIS data capture and updating of small areas where traditional photogrammetry is not feasible. The main aim of this study is to design a low-cost light UAV system for GIS data capture and updating. The investigation is based on aerial images recorded during flights performed with the UAV system over a test site on the Davutpasa Campus of Yildiz Technical University, Istanbul. The quality of the DEMs and orthoimages generated from the UAV flights is discussed with respect to GIS data capture and updating for small areas.
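The spatial resolution achievable by such a UAV flight follows from simple camera geometry: the ground sample distance (GSD) scales with flying height and pixel pitch and inversely with focal length. A minimal sketch, using illustrative values rather than the actual camera flown in the study:

```python
# Ground sample distance (GSD) and image footprint for a nadir UAV image.
# All values are illustrative assumptions, not the study's actual configuration.
flying_height = 120.0   # metres above ground level
focal_length = 0.016    # metres (a 16 mm lens)
pixel_size = 4.4e-6     # metres (4.4 micrometre pixel pitch)
image_width_px, image_height_px = 4000, 3000

# Similar triangles: one pixel on the sensor maps to (H / f) * pixel_size on the ground
gsd = flying_height * pixel_size / focal_length   # metres per pixel
footprint_w = gsd * image_width_px                # ground coverage of one image (m)
footprint_h = gsd * image_height_px

print(f"GSD: {gsd * 100:.1f} cm/px, footprint: {footprint_w:.0f} x {footprint_h:.0f} m")
```

Under these assumptions a single image covers roughly 132 m by 99 m at about 3.3 cm per pixel, which is the kind of trade-off that drives flight planning for small-area DEM and orthoimage generation.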
This paper presents an automatic building extraction approach using LiDAR data and aerial photographs from a multi-sensor system positioned at the same platform. The automatic building extraction approach consists of segmentation, analysis and classification steps based on object-based image analysis. The chessboard, contrast split and multi-resolution segmentation methods were used in the segmentation step. The determined object primitives in segmentation, such as scale parameter, shape, completeness, brightness, and statistical parameters, were used to determine threshold values for classification in the analysis step. The rule-based classification was carried out with defined decision rules based on determined object primitives and fuzzy rules. In this study, hierarchical classification was preferred. First, the vegetation and ground classes were generated; the building class was then extracted. The NDVI, slope and Hough images were generated and used to avoid confusing the building class with other classes. The intensity images generated from the LiDAR data and morphological operations were utilized to improve the accuracy of the building class. The proposed approach achieved an overall accuracy of approximately 93% for the target class in a suburban neighborhood, which was the study area. Moreover, completeness (96.73%) and correctness (95.02%) analyses were performed by comparing the automatically extracted buildings and reference data. Keywords: LiDAR; Building Extraction; Hough; NDVI; Segmentation; Classification. Bol. Ciênc. Geod., sec. Artigos, Curitiba, v. 19, n. 2, p. 153-171, abr-jun, 2013.
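The NDVI rule used to separate vegetation from building candidates can be sketched as a simple per-pixel threshold. The toy band values and the 0.3 threshold below are illustrative assumptions, not the study's calibrated parameters:

```python
import numpy as np

# NDVI = (NIR - Red) / (NIR + Red); values near +1 indicate healthy vegetation.
# Toy 2x2 reflectance bands stand in for the co-registered aerial imagery.
nir = np.array([[0.60, 0.55],
                [0.10, 0.08]])
red = np.array([[0.10, 0.12],
                [0.09, 0.07]])

ndvi = (nir - red) / (nir + red + 1e-12)  # small epsilon avoids division by zero
vegetation_mask = ndvi > 0.3              # rule-based threshold (illustrative value)

# Pixels failing the vegetation rule remain candidates for the building class,
# to be refined further with slope, Hough and LiDAR intensity rules.
building_candidates = ~vegetation_mask
```

In the hierarchical scheme described above, such a mask would be applied first, so that later building rules only operate on non-vegetation objects.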