Synthetic aperture radar (SAR) multi-target interactive motion recognition classifies the type of interactive motion and generates semantic-level descriptions of the interactions by considering the relevance of multi-target motions. A method for SAR multi-target interactive motion recognition is proposed, comprising moving-target detection, target type recognition, interactive motion feature extraction, and multi-target interactive motion type recognition. For target type recognition, wavelet thresholding denoising is combined with a convolutional neural network (CNN): SAR target images are first denoised by wavelet thresholding and then classified by an eight-layer CNN named EilNet. For multi-target interactive motion type recognition, a motion feature matrix is constructed and a four-layer CNN named FolNet is designed to classify the interactive motion type. A motion simulation dataset based on the MSTAR dataset is built, which includes four kinds of interactive motions performed by two moving targets. The experimental results show that both the Wavelet + EilNet method for target type recognition and FolNet for multi-target interactive motion type recognition outperform competing methods. Thus, the proposed method is effective for SAR multi-target interactive motion recognition.
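The abstract does not include code, but the preprocessing step it describes (wavelet thresholding denoising of SAR target chips before CNN classification) can be illustrated with a minimal sketch. The sketch below assumes the PyWavelets library; the "db4" wavelet, decomposition level, and universal soft-threshold rule are illustrative assumptions rather than the authors' settings, and the EilNet architecture itself is not reproduced here.

```python
# Minimal sketch of wavelet-thresholding denoising for a SAR target chip
# (assumed preprocessing; the wavelet, decomposition level, and threshold rule
# are illustrative choices, not the authors' exact settings).
import numpy as np
import pywt

def wavelet_denoise(chip, wavelet="db4", level=2):
    """Soft-threshold the detail coefficients of a 2-D SAR image chip."""
    coeffs = pywt.wavedec2(chip, wavelet, level=level)
    # Robust noise estimate (MAD) from the finest diagonal detail band.
    sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
    # Universal threshold (an assumed choice of threshold rule).
    thr = sigma * np.sqrt(2.0 * np.log(chip.size))
    denoised = [coeffs[0]] + [
        tuple(pywt.threshold(band, thr, mode="soft") for band in detail)
        for detail in coeffs[1:]
    ]
    return pywt.waverec2(denoised, wavelet)
```

A denoised chip would then be passed to the target-type classifier (the eight-layer EilNet in the paper, whose architecture is not specified in the abstract).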
Clouds, cloud shadows (CCS), and numerous other factors cause missing data in passive remote sensing images. A well-known reconstruction approach selects a similar pixel (with the aid of an additional clear reference image) from the remaining clear part of the image to replace each missing pixel. Because the missing value is filled with a pixel acquired from the same image, by the same sensor and on the same date, this approach is suitable for time-series applications when a time-series-profile-based similarity measure is used to select the similar pixel. However, since each similar pixel is selected independently, an improper reference pixel or varying accuracy across land-cover types produces salt-and-pepper noise in the reconstructed part of the image. To overcome these problems, this paper presents a spectral-temporal patch (STP)-based missing-area reconstruction method for time-series images. First, STPs, whose pixels have similar spectral and temporal evolution characteristics, are extracted using multi-temporal image segmentation; some STPs have missing observations (STPMO) in the time series and must be reconstructed. Next, for each STPMO, the most similar STP is selected as the reference STP, and the mean and standard deviation of the STPMO are predicted by linear regression against the reference STP. Finally, the textural information, characterized by the spatial configuration of the colors or intensities of neighboring pixels, is extracted from the clear temporal-adjacent STP and "injected" into the missing area to obtain synthetic cloud-free images. We performed an STP-based missing-area reconstruction experiment in Jiangzhou, Chongzuo, Guangxi, with time-series images acquired on 12 different dates by the wide field view (WFV) sensor onboard the Chinese Gao Fen 1 satellite. The results indicate that the proposed method effectively recovers the missing information without salt-and-pepper noise in the reconstructed area, and the reconstructed part of the image is consistent with the clear part without false edges. The results confirm that spectral information from the remaining clear part of the same image and textural information from the temporal-adjacent image can together create seamless time-series images.
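The two reconstruction steps at the core of the method (predicting the missing patch's mean and standard deviation by linear regression against the reference STP, and injecting texture from a clear temporal-adjacent patch) can be sketched as follows. This is a minimal sketch under assumed inputs: segmentation, similarity search, and per-band handling are omitted, and all function and variable names are hypothetical.

```python
# Minimal sketch of the mean/std prediction and texture-injection steps
# (inputs are assumed to come from prior multi-temporal segmentation and
# reference-STP selection; names and the exact injection formula are illustrative).
import numpy as np

def predict_stats(ref_means, ref_stds, tgt_means, tgt_stds, ref_mean_t, ref_std_t):
    """Predict the missing-date mean/std of the target STP (STPMO) from the
    reference STP via simple linear regression fitted on the clear dates."""
    slope_m, intercept_m = np.polyfit(ref_means, tgt_means, 1)   # mean model
    slope_s, intercept_s = np.polyfit(ref_stds, tgt_stds, 1)     # std model
    return slope_m * ref_mean_t + intercept_m, slope_s * ref_std_t + intercept_s

def inject_texture(adjacent_patch, pred_mean, pred_std):
    """Transfer the texture of a clear temporal-adjacent patch into the missing
    area by matching its mean and standard deviation to the predicted values
    (an assumed formulation of the paper's "injection" step)."""
    mu, sigma = adjacent_patch.mean(), adjacent_patch.std()
    return (adjacent_patch - mu) / (sigma + 1e-12) * pred_std + pred_mean
```

In the full method, these steps would be applied per spectral band and per STPMO, using the reference STP chosen by the time-series similarity measure.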