2021
DOI: 10.1080/13658816.2021.1873998
A hybrid approach to building simplification with an evaluator from a backpropagation neural network

Cited by 29 publications (10 citation statements)
References 31 publications
“…They compared the performance of three network architectures (i.e., U-net, residual U-net, and generative adversarial network [GAN]) at different target map scales, and concluded that all three networks are able to generate building maps at different scales. Yang et al. (2021) proposed a supervised neural network model to identify the best simplified representation of buildings from the simplification candidates generated by existing algorithms. In addition, some studies use neural network models for line simplification.…”
Section: Related Work
confidence: 99%
“…In the map generalization domain, existing neural network‐based studies focus mainly on building generalization (Cheng et al, 2013; Feng et al, 2019; Sester et al, 2018; Touya et al, 2019; Yang et al, 2021). For example, Feng et al (2019) employed deep convolutional neural networks (DCNNs) for building generalization with raster images.…”
Section: Related Work
confidence: 99%
“…For a building, as Figure 12a shows, the tangent angle of an arbitrary point O on its outline with respect to a reference orientation (e.g., the x-axis) is ψ, and the turning function f(s) is defined as the variation of the tangent angle ψ along the outline, traversed counterclockwise, as a function of the arc length s (as Figure 12b shows). The total arc length s is normalized to 1 [42]. The SDC between the original and simplified buildings is measured as follows [40]:…”
Section: Determination of the Simplification Evaluation Indicators
confidence: 99%
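The turning-function definition quoted above can be sketched in code. The snippet below is an illustrative reconstruction, not code from the cited papers: it computes the tangent angle ψ as a function of normalized arc length s for a closed polygon, and `turning_distance` is a simple L2 stand-in for the SDC-style comparison (the actual formula referenced as [40] is elided above). This sketch is scale-invariant but not invariant to the choice of start vertex, which a full shape measure would also handle.

```python
import numpy as np

def turning_function(poly):
    """Turning function of a closed polygon.

    poly: (N, 2) array of vertices in counterclockwise order.
    Returns (s, psi): the normalized arc length at the start of each
    edge (total length scaled to 1) and the cumulative tangent angle,
    which is constant along each edge.
    """
    poly = np.asarray(poly, dtype=float)
    edges = np.roll(poly, -1, axis=0) - poly            # edge vectors
    lengths = np.linalg.norm(edges, axis=1)
    s = np.concatenate([[0.0], np.cumsum(lengths)]) / lengths.sum()
    angles = np.arctan2(edges[:, 1], edges[:, 0])       # tangent angle per edge
    turns = np.diff(angles)
    turns = (turns + np.pi) % (2 * np.pi) - np.pi       # wrap turns to (-pi, pi]
    psi = angles[0] + np.concatenate([[0.0], np.cumsum(turns)])
    return s[:-1], psi

def turning_distance(poly_a, poly_b, samples=256):
    """L2 distance between two turning functions on a common arc-length grid."""
    grid = np.linspace(0.0, 1.0, samples, endpoint=False)
    def sample(poly):
        s, psi = turning_function(poly)
        idx = np.searchsorted(s, grid, side="right") - 1
        return psi[idx]
    return np.sqrt(np.mean((sample(poly_a) - sample(poly_b)) ** 2))
```

Because the arc length is normalized to 1, a building and a uniformly scaled copy of it yield identical turning functions, so only shape (not size) is compared.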
“…In combination-based methods, different algorithms are combined to accomplish the building simplification task. Given the many available algorithms for simplifying buildings, Yang et al. (2021) [42] presented a hybrid approach that uses four existing algorithms to generate simplification candidates and a backpropagation neural network to identify the best simplified representation among them. Wei et al. (2021) [8] proposed a combined building simplification approach based on local structure classification.…”
confidence: 99%
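The generate-candidates-then-evaluate scheme described in that quote can be sketched as follows. This is a minimal sketch, not the paper's implementation: `simplifiers` stands in for the four existing simplification algorithms, and `score_candidate` for the trained backpropagation-network evaluator; both names are hypothetical.

```python
def simplify_hybrid(building, simplifiers, score_candidate):
    """Run every candidate simplifier on the building, score each
    result with the learned evaluator, and return the best candidate.

    building:        input geometry (any representation the callables accept)
    simplifiers:     list of callables, each mapping building -> candidate
    score_candidate: callable (building, candidate) -> float, higher is better
    """
    candidates = [simplify(building) for simplify in simplifiers]
    scores = [score_candidate(building, c) for c in candidates]
    best = max(range(len(candidates)), key=scores.__getitem__)
    return candidates[best]
```

The design keeps the conventional algorithms untouched and confines learning to the selection step, which is what distinguishes this hybrid approach from end-to-end generative models.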
“…Some studies have approached pre-processing operations of generalization, e.g., structure recognition (Yan et al., 2019), building grouping (Yan et al., 2020), and shape coding (Yan et al., 2021), most of which are based on graph convolutional networks (GCNs). Recently, a classifier was trained using a backpropagation neural network to select the most appropriate conventional simplification algorithm for each building shape (Yang et al., 2022), but this still falls short of explicitly modeling the associated generalization operators, most importantly simplification. Although GCNs have in principle been shown capable of encoding vector map objects, it remains unclear how they could support end-to-end map generalization, that is, vector maps in and vector maps out.…”
confidence: 99%