2019 24th Conference of Open Innovations Association (FRUCT)
DOI: 10.23919/fruct.2019.8711930

Building Detection on Aerial Images Using U-NET Neural Networks

Cited by 35 publications (18 citation statements)
References 3 publications

“…To form an objective and comprehensive evaluation of the detection performance of oil tanks, we calculate several evaluation statistics based on the detected result images, referring to the ground truth oil tanks. In accordance with the evaluation strategy adopted in recently published work on object detection [24,32-34], the precision, recall, F1-measure, and Intersection-Over-Union (IOU) are calculated to evaluate the detection performance of each method. As indicated in Equations (3)-(6), Precision is the percentage of detected oil tank pixels that are ground truth oil tank pixels, Recall is the percentage of ground truth oil tank pixels that are detected, F1-measure is calculated as a general performance indicator that balances precision and recall, and IOU evaluates the overlap ratio between detected oil tank pixels and ground truth pixels.…”
Section: Experiments and Results
confidence: 99%
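
The excerpt's prose definitions correspond to the standard pixel-level formulas. A minimal sketch of all four metrics, assuming binary NumPy masks with 1 marking oil-tank pixels (illustrative only, not code from the cited papers):

import numpy as np

def detection_metrics(pred, truth):
    """Precision, recall, F1-measure, and IOU for two binary masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()   # detected pixels that are ground truth
    fp = np.logical_and(pred, ~truth).sum()  # detected pixels that are not
    fn = np.logical_and(~pred, truth).sum()  # ground-truth pixels that were missed
    # Denominators are assumed nonzero, i.e. both masks are non-empty.
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    iou = tp / (tp + fp + fn)                # overlap over the union of both masks
    return precision, recall, f1, iou
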
“…In this paper, we propose an end-to-end deep neural network, Res2-Unet+, to detect oil tanks with various shapes, sizes, and illumination conditions in a large-scale area based on high spatial resolution images. The network structure of Res2-Unet+ is modified from the typical network structure of Unet [23], which has been widely used for detecting various objects of interest based on remotely sensed images [24]. The main contributions of our paper are listed below:…”
Section: Introduction
confidence: 99%
“…The accuracy of the segmentation results is usually evaluated with different metrics, e.g. the Sørensen-Dice coefficient (SDC) (Ivanovsky et al 2019, Ulku et al 2020, Chhor, Aramburu and Bougdal-Lambert 2017). The SDC (also known as the F1 score) is a statistic that assesses the area of overlap, or intersection, over the total number of pixels in both images.…”
Section: Experimental Results and Discussion
confidence: 99%
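
For reference, the SDC under the same binary-mask assumption reduces to a single expression (again an illustrative sketch, not code from the cited works):

import numpy as np

def dice_coefficient(a, b):
    """Sørensen-Dice coefficient: twice the intersection over the total pixel count."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

For binary masks this value equals the F1-measure computed from pixel-level counts, which is why the excerpt treats the two names as synonyms.
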
“…We used the U-net model architecture (Ronneberger et al, 2015). The model belongs to the family of Convolutional Neural Networks (CNNs), specifically fully convolutional network (FCN) architectures (Long, Shelhamer, & Darrell, 2015), and has been reported to be robust in many problems that require semantic segmentation, such as medical (Ibtehaz & Rahman, 2020) and aerial (Ivanovsky, Khryashchev, Pavlov, & Ostrovskaya, 2019) image segmentation. The model mainly works with an encoder-decoder architecture where the contracting encoder extracts abstract features from an image while the expanding decoder block reconstructs segmented features (Ibtehaz & Rahman, 2020).…”
Section: The Model and Training Process
confidence: 99%
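
To make the encoder-decoder description concrete, here is a deliberately reduced two-level U-Net-style sketch in PyTorch. The depth, channel widths, and class count are illustrative assumptions; this is not the original architecture of Ronneberger et al. (2015) or any of the cited models.

import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU: the repeated unit on both paths of U-Net.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )

class TinyUNet(nn.Module):
    def __init__(self, in_ch=3, n_classes=1):
        super().__init__()
        self.enc1 = conv_block(in_ch, 32)        # contracting (encoder) path
        self.enc2 = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(64, 128)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = conv_block(128, 64)          # 128 = 64 upsampled + 64 skip
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)           # 64 = 32 upsampled + 32 skip
        self.head = nn.Conv2d(32, n_classes, 1)  # per-pixel class logits

    def forward(self, x):                        # H and W divisible by 4
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))  # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)                     # logits at input resolution

The skip connections from encoder to decoder are what distinguish U-Net from a plain encoder-decoder FCN: they let the expanding path reuse fine spatial detail that pooling discards.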