This paper presents Inceptive Event Time-Surfaces (IETS), a novel fusion of low-level dimensionality-reduction approaches into an effective representation of high-level objects in neuromorphic camera data. IETSs overcome several limitations of conventional time-surfaces: they increase robustness to noise, promote spatial consistency, and improve the temporal localization of (moving) edges. Combining IETS with transfer learning improves on state-of-the-art performance for the challenging problem of object classification from event camera data.
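For context, a conventional time-surface (the representation IETS builds on, e.g. HOTS-style surfaces) assigns each pixel an exponentially decayed value based on the timestamp of its most recent event. The sketch below is illustrative only, not the IETS variant itself; the function name, event layout, and the decay constant `tau` are assumptions.

```python
import numpy as np

def time_surface(events, t_ref, shape, tau):
    """Exponentially decayed time-surface at reference time t_ref.

    events: iterable of (x, y, t) tuples, assumed sorted by timestamp t.
    Pixels with no event yet decay to 0 (last_t initialized to -inf).
    """
    last_t = np.full(shape, -np.inf)          # timestamp of most recent event per pixel
    for x, y, t in events:
        if t > t_ref:                         # ignore events after the reference time
            break
        last_t[int(y), int(x)] = t
    return np.exp((last_t - t_ref) / tau)     # values in (0, 1]; 0 for untouched pixels

# toy example: two events on a 4x4 sensor, evaluated at t_ref = 50
events = [(1, 1, 10.0), (2, 2, 40.0)]
ts = time_surface(events, t_ref=50.0, shape=(4, 4), tau=50.0)
# ts[2, 2] = exp(-10/50), ts[1, 1] = exp(-40/50), all other pixels 0
```

IETS, per the abstract, modifies this basic scheme to suppress noise events and sharpen the temporal localization of edges; the details are in the paper itself.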
In this work, we assess the detection and classification of specially constructed targets in coincident airborne hyperspectral imagery (HSI) and high spatial resolution panchromatic imagery (HRI) in spectral, spatial, and joint spatial-spectral feature spaces. The target discrimination powers of the data-level and feature-level fusion of HSI and HRI are also directly compared in the spatial-spectral context using airborne imagery collected explicitly for this research. We show that in the case of Bobcat 2013 imagery, feature-level fusion of the HSI spectrum with spatial features derived from the coincident HRI data consistently results in fewer false alarms on the scene background as well as fewer misclassifications among the tested targets. Furthermore, this approach also outperforms schemes in which data-level fusion of the HSI and HRI imagery is performed prior to extracting spatial-spectral features.

Index Terms: Hyperspectral imagery (HSI), image fusion, material identification, pansharpening, spatial-spectral feature extraction, target identification.
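Feature-level fusion, as described in the abstract, amounts to concatenating each pixel's HSI spectrum with spatial features computed from the coincident HRI. The sketch below is a minimal illustration under assumed inputs; the specific spatial descriptors (patch mean, std, gradient magnitudes) are stand-ins, not the paper's actual features.

```python
import numpy as np

def hri_spatial_features(patch):
    """Toy spatial descriptor for an HRI patch: mean, std, and mean
    absolute gradients along each axis (illustrative stand-ins)."""
    gy, gx = np.gradient(patch.astype(float))
    return np.array([patch.mean(), patch.std(),
                     np.abs(gx).mean(), np.abs(gy).mean()])

def fuse(hsi_spectrum, hri_patch):
    """Feature-level fusion: spectral vector concatenated with spatial features."""
    return np.concatenate([hsi_spectrum, hri_spatial_features(hri_patch)])

spectrum = np.linspace(0.1, 0.9, 100)             # synthetic 100-band spectrum
patch = np.arange(25, dtype=float).reshape(5, 5)  # synthetic 5x5 HRI patch
fused = fuse(spectrum, patch)                     # 104-dimensional joint feature
```

In contrast, data-level fusion (e.g. pansharpening) merges the HSI and HRI images first and extracts features from the fused image; the abstract reports that the feature-level approach performed better on the Bobcat 2013 imagery.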
The amount of hyperspectral imagery (HSI) data currently available is relatively small compared to other imaging modalities, and what is suitable for developing, testing, and evaluating spatial-spectral algorithms is virtually nonexistent. In this work, a significant amount of coincident airborne hyperspectral and high spatial resolution panchromatic imagery that supports the advancement of spatial-spectral feature extraction algorithms was collected to address this need. The imagery was collected in April 2013 for Ohio University by the Civil Air Patrol, with their Airborne Real-time Cueing Hyperspectral Enhanced Reconnaissance (ARCHER) sensor. The target materials, shapes, and movements throughout the collection area were chosen such that evaluation of change detection algorithms, atmospheric compensation techniques, image fusion methods, and material detection and identification algorithms is possible. This paper describes the collection plan, data acquisition, and initial analysis of the collected imagery.