Structural disaster damage detection and characterization is one of the oldest remote sensing challenges, and the utility of virtually every type of active and passive sensor deployed on various air- and spaceborne platforms has been assessed. The proliferation and growing sophistication of unmanned aerial vehicles (UAVs) in recent years have opened up many new opportunities for damage mapping, owing to the high spatial resolution, the resulting stereo images and derivatives, and the flexibility of the platform. This study provides a comprehensive review of how UAV-based damage mapping has evolved from providing simple descriptive overviews of a disaster scene, to more sophisticated texture- and segmentation-based approaches, and finally to studies using advanced deep learning approaches, as well as multi-temporal and multi-perspective imagery, to provide comprehensive damage descriptions. The paper further reviews studies on the utility of the developed mapping strategies and image processing pipelines for first responders, focusing especially on the outcomes of two recent European research projects, RECONASS (Reconstruction and Recovery Planning: Rapid and Continuously Updated Construction Damage, and Related Needs Assessment) and INACHUS (Technological and Methodological Solutions for Integrated Wide Area Situation Awareness and Survivor Localization to Support Search and Rescue Teams). Finally, recent and emerging developments are reviewed, such as improvements in machine learning, increasing mapping autonomy, damage mapping in interior, GPS-denied environments, the utility of UAVs for infrastructure mapping and maintenance, and the emergence of UAVs with robotic abilities.
Automatic post-disaster mapping of building damage using remote sensing images is an important and time-critical element of disaster management. The characteristics of remote sensing images available immediately after a disaster are uncertain, since they may vary in terms of capturing platform, sensor view, image scale, and scene complexity. Therefore, a generalized damage detection method that is insensitive to these image characteristics is desirable. This study aims to develop a method for grid-level damage classification of remote sensing images by detecting damage corresponding to debris, rubble piles, and heavy spalling within a defined grid, regardless of the aforementioned image characteristics. The Visual-Bag-of-Words (BoW) is one of the most widely used and proven frameworks for image classification in the field of computer vision. The framework adopts a feature representation strategy that has been shown to be more effective for image classification, regardless of scale and clutter, than conventional global feature representations. In this study, supervised models using various radiometric descriptors (histogram of gradient orientations (HoG) and Gabor wavelets) and classifiers (SVM, Random Forests, and AdaBoost) were developed for damage classification based on both BoW and conventional global feature representations, and were tested with four datasets that vary according to the aforementioned image characteristics. The BoW framework outperformed the conventional global feature representation approaches in all scenarios (i.e., for all combinations of feature descriptors, classifiers, and datasets), producing an average accuracy of approximately 90%. Particularly encouraging was a 14-percentage-point accuracy improvement (from 77% to 91%) produced by BoW over the global representation for the most complex dataset, which was used to test generalization capability.
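To make the described pipeline concrete, the sketch below illustrates a generic Visual-Bag-of-Words grid-level classifier: local HoG descriptors are extracted from patches within each grid cell, a visual vocabulary is learned by k-means clustering, each cell is encoded as a histogram of visual words, and a supervised classifier is trained on the histograms. This is a minimal illustration, not the study's exact implementation; the patch size, vocabulary size, and the choice of a linear SVM (rather than Random Forests or AdaBoost, or Gabor descriptors) are assumptions made for brevity.

```python
# Minimal Visual-Bag-of-Words sketch for grid-level damage classification.
# Assumes each grid cell is a pre-cropped grayscale tile with a binary label
# (damaged / undamaged). Parameter values are illustrative assumptions.
import numpy as np
from skimage.feature import hog
from sklearn.cluster import KMeans
from sklearn.svm import SVC

PATCH = 32          # side length of local patches (assumption)
VOCAB_SIZE = 200    # number of visual words (assumption)

def local_descriptors(tile):
    """Dense HoG descriptors from non-overlapping patches of one grid tile."""
    descs = []
    for r in range(0, tile.shape[0] - PATCH + 1, PATCH):
        for c in range(0, tile.shape[1] - PATCH + 1, PATCH):
            patch = tile[r:r + PATCH, c:c + PATCH]
            descs.append(hog(patch, orientations=9,
                             pixels_per_cell=(8, 8),
                             cells_per_block=(2, 2)))
    return np.array(descs)

def encode(tile, vocab):
    """Encode a tile as an L1-normalised histogram of visual-word assignments."""
    words = vocab.predict(local_descriptors(tile))
    hist = np.bincount(words, minlength=VOCAB_SIZE).astype(float)
    return hist / max(hist.sum(), 1.0)

def train_bow_classifier(tiles, labels):
    """Learn the visual vocabulary, encode all tiles, and fit a linear SVM."""
    all_descs = np.vstack([local_descriptors(t) for t in tiles])
    vocab = KMeans(n_clusters=VOCAB_SIZE, n_init=10).fit(all_descs)
    X = np.array([encode(t, vocab) for t in tiles])
    clf = SVC(kernel="linear").fit(X, labels)
    return vocab, clf

def classify(tile, vocab, clf):
    """Predict the damage label of a single grid tile."""
    return clf.predict(encode(tile, vocab).reshape(1, -1))[0]
```

The key design choice reflected here is that the BoW encoding summarizes local texture statistics independently of where damage appears within the cell, which is why it tends to generalize better across image scale and scene clutter than a single global feature vector computed over the whole tile.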