Remote Sensing Building Detection Based on Binarized Semantic Segmentation
2019
DOI: 10.3788/aos201939.1228002

Cited by 2 publications (1 citation statement). References: 0 publications.
“…With the rise of artificial intelligence technology, convolutional neural networks can effectively suppress interference among ground-object features in remote sensing images and extract target features, so they are widely used in semantic segmentation of buildings in remote sensing imagery. Zhu Tianyou et al. [4] proposed MBU-Net, a semantic segmentation network based on mixed binary and floating-point computation, for real-time building detection in remote sensing images. Liu Hao et al. [5] improved the UNet network and used a combination of the Dice coefficient and the cross-entropy function as the loss function, achieving good results in the extraction of irregular buildings.…”
Section: Introduction
confidence: 99%
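The combined Dice + cross-entropy loss mentioned for Liu Hao et al. [5] can be sketched as below. This is a minimal illustrative version in NumPy, not the cited paper's implementation: the weighting `alpha` and the smoothing term `eps` are assumptions chosen here for numerical stability.

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    # Dice coefficient for binary masks: 2*|A ∩ B| / (|A| + |B|).
    # eps avoids division by zero when both masks are empty (an assumption).
    inter = np.sum(pred * target)
    return (2.0 * inter + eps) / (np.sum(pred) + np.sum(target) + eps)

def combined_loss(pred, target, alpha=0.5, eps=1e-7):
    # Weighted sum of binary cross-entropy and Dice loss (1 - Dice).
    # pred: predicted probabilities in (0, 1); target: ground-truth 0/1 mask.
    # alpha balances the two terms; 0.5 is an illustrative default.
    p = np.clip(pred, eps, 1.0 - eps)  # clip so log() stays finite
    bce = -np.mean(target * np.log(p) + (1.0 - target) * np.log(1.0 - p))
    dice_loss = 1.0 - dice_coefficient(pred, target, eps)
    return alpha * bce + (1.0 - alpha) * dice_loss
```

The intuition behind the combination: cross-entropy gives well-behaved per-pixel gradients, while the Dice term directly rewards region overlap, which helps when building pixels are a small fraction of the image.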