2020
DOI: 10.1080/01431161.2020.1737339

Outlier detection and robust plane fitting for building roof extraction from LiDAR data

Abstract: Individual roof plane extraction from LiDAR point-cloud data is a complex and difficult task because of the unknown semantic characteristics and inharmonious behaviour of the input data. Most existing state-of-the-art methods fail to detect small true roof planes with exact boundaries because of outliers, occlusions, complex building structures and the otherwise inconsistent nature of LiDAR data. In this paper, we present an improved building detection and roof plane extraction method, which is less sensitive to the…
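The title's core operation, robust plane fitting on noisy LiDAR points, is a standard technique. The following is a minimal RANSAC-style sketch of that general idea, not the authors' specific algorithm; the function name, iteration count, and 0.15 m inlier tolerance are illustrative assumptions.

```python
import numpy as np

def fit_plane_ransac(points, n_iters=200, inlier_tol=0.15, seed=None):
    """Robustly fit a plane to an (N, 3) array of LiDAR points.

    Generic RANSAC sketch (not the paper's method): repeatedly fit a
    plane to 3 random points and keep the candidate that explains the
    most points within `inlier_tol` metres.
    """
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(points), dtype=bool)

    for _ in range(n_iters):
        sample = points[rng.choice(len(points), size=3, replace=False)]
        # Plane normal from the cross product of two edge vectors.
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:          # degenerate (collinear) sample
            continue
        normal /= norm
        # Point-to-plane distances for all points.
        dists = np.abs((points - sample[0]) @ normal)
        inliers = dists < inlier_tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers

    if best_inliers.sum() < 3:
        raise ValueError("RANSAC found no valid plane")

    # Least-squares refit on the inliers via SVD for a stable normal.
    inlier_pts = points[best_inliers]
    centroid = inlier_pts.mean(axis=0)
    _, _, vh = np.linalg.svd(inlier_pts - centroid)
    normal = vh[-1]              # direction of least variance
    return normal, centroid, best_inliers
```

In a full roof extraction pipeline, a fit like this would typically be applied repeatedly, removing each detected plane's inliers before searching for the next plane; per the abstract, the proposed method aims to make this step less sensitive to such data inconsistencies.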

Cited by 24 publications (34 citation statements: 0 supporting, 34 mentioning, 0 contrasting). References 46 publications.
“…13 shows these selected buildings with individual reference roof plane boundaries in blue polygons. We used the method proposed by Dey et al. [53] to extract the individual roof planes and apply the RCC metric to evaluate the extracted planes. The colour bar indicates the distance estimation d_avg by the RCC metric and is scaled from the largest (red) to the smallest (green).…”
Section: B. Experimental Results | mentioning | confidence: 99%
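The RCC metric and its distance estimate d_avg are defined in the cited work, not in this excerpt. Purely as a hypothetical illustration of what an average boundary-distance score of this kind could look like, the sketch below computes the mean nearest-vertex distance from an extracted roof-plane boundary to its reference polygon; the function name and formula are assumptions, not the actual RCC definition.

```python
import numpy as np

def avg_boundary_distance(extracted, reference):
    """Hypothetical d_avg-style score: mean nearest-neighbour distance
    (in metres) from extracted boundary vertices (M, 2) to reference
    polygon vertices (N, 2). NOT the cited RCC definition; it only
    illustrates the kind of per-plane score the excerpt describes.
    """
    # Pairwise distances between every extracted and reference vertex.
    diffs = extracted[:, None, :] - reference[None, :, :]
    dists = np.linalg.norm(diffs, axis=2)        # shape (M, N)
    return dists.min(axis=1).mean()
```

A per-plane score like this can then be mapped onto a red-to-green colour ramp, as described for the figure in the excerpt.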
“…The buildings with area less than 5 m² were not considered for evaluation in our experiments. We follow the method proposed by Dey et al. [53] for building boundary extraction. Fig.…”
Section: A. Data Sets | mentioning | confidence: 99%
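As a small illustration of the 5 m² area cut mentioned above (hypothetical names; the excerpt does not specify how footprint areas were computed), areas can be obtained with the shoelace formula and thresholded:

```python
import numpy as np

def polygon_area(vertices):
    """Shoelace area of a closed 2D polygon given as an (N, 2) array."""
    x, y = vertices[:, 0], vertices[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

# Toy footprint: a 2 m x 2 m square (4 m^2) falls below the 5 m^2 cut.
square = np.array([[0.0, 0.0], [2.0, 0.0], [2.0, 2.0], [0.0, 2.0]])
keep = polygon_area(square) >= 5.0   # False: excluded from evaluation
```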
“…To alleviate these problems, refs. [17–24] combined optical imagery with GIS data, digital surface models (DSMs) obtained from light detection and ranging (lidar), or synthetic aperture radar interferometry to distinguish non-buildings that are highly similar to buildings, increasing the robustness of building extraction. However, obtaining a wide range of corresponding multisource data is always costly.…”
Section: Related Work | mentioning | confidence: 99%
“…Estimation of feature points and lines is a fundamental problem in the field of image and shape analysis, as this estimation facilitates a better understanding of an object in a variety of areas, e.g., data registration [1], data simplification [2], road extraction [3], and building reconstruction [4]. Specifically, the area of 3D building reconstruction has a broad range of applications, such as building type classification, urban planning, solar potential estimation, change detection, forest management, and virtual tours [5–10]. Due to the availability of 3D point cloud data from both airborne and ground-based mobile laser scanning systems, the extraction of 3D feature points and lines from point cloud data has become an attractive research topic for describing an object's shape more accurately.…”
Section: Introduction | mentioning | confidence: 99%
“…Therefore, LiDAR data provide more accurate geometric information than 2D images and are used as the main input data for automatic building reconstruction [12,13]. Reconstruction approaches from LiDAR data can be broadly categorised into two groups: model-driven and data-driven [7]. The first approach finds, among models previously stored in a database, the one most similar to the input data, whereas the second approach tries to generate any building model from the provided 3D data.…”
Section: Introduction | mentioning | confidence: 99%