Hashing-based cross-modal search, which aims to map features from multiple modalities into binary codes, has attracted increasing attention due to its storage and search efficiency, especially in large-scale database retrieval. Recent unsupervised deep cross-modal hashing methods have shown promising results. However, existing approaches typically suffer from two limitations: (1) they usually learn cross-modal similarity information separately or in a redundant fusion manner, which may fail to sufficiently and effectively capture semantic correlations among instances from different modalities; (2) they seldom consider sampling and weighting schemes for unsupervised cross-modal hashing, resulting in hash codes that lack satisfactory discriminative ability. To overcome these limitations, we propose a novel unsupervised deep cross-modal hashing method called Joint-modal Distribution-based Similarity Hashing (JDSH) for large-scale cross-modal retrieval. First, we propose a novel cross-modal joint-training method that constructs a joint-modal similarity matrix to fully preserve the cross-modal semantic correlations among instances. Second, we propose a sampling and weighting scheme termed Distribution-based Similarity Decision and Weighting (DSDW) for unsupervised cross-modal hashing, which generates more discriminative hash codes by pushing semantically similar instance pairs closer and pulling semantically dissimilar instance pairs apart. Experimental results demonstrate the superiority of JDSH over several unsupervised cross-modal hashing methods on two public datasets, NUS-WIDE and MIRFlickr.
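As a rough illustration of the joint-modal similarity idea, the sketch below fuses modality-specific cosine-similarity matrices into a single joint matrix. The fusion rule, the weight `alpha`, and the function name are assumptions for illustration only; the abstract does not specify JDSH's exact construction.

```python
import torch
import torch.nn.functional as F

def joint_modal_similarity(img_feats, txt_feats, alpha=0.5):
    """Hypothetical construction of a joint-modal similarity matrix.

    img_feats: (n, d_i) image features; txt_feats: (n, d_t) text
    features for the same n instances. The fusion weight `alpha` is
    an assumption; the paper's actual fusion rule may differ.
    """
    # Cosine similarity within each modality.
    img = F.normalize(img_feats, dim=1)
    txt = F.normalize(txt_feats, dim=1)
    s_img = img @ img.t()          # image-image similarity, (n, n)
    s_txt = txt @ txt.t()          # text-text similarity, (n, n)
    # Fuse the two modality-specific matrices into one joint matrix.
    return alpha * s_img + (1 - alpha) * s_txt
```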
Underwater image enhancement has drawn much attention due to the significance of underwater vision. Although considerable progress has been made in this field, a key problem remains unsolved: how can we extract and enhance fine details while removing the noise caused by scattering and attenuation? To address this limitation, we propose a new underwater image enhancement technique built on a novel spiral generative adversarial framework, named Spiral-GAN, which can effectively recover real-world underwater images with more detail, vivid colors, and better contrast. For stable training and color correction, we include pixel-wise losses consisting of a mean squared error and an angle error in our objective function. In addition, we design our generator with several deconv-conv blocks to preserve details from the original distorted images. Furthermore, we present a spiral learning strategy that generalizes the enhancement model to effectively recover real-world underwater images. Finally, we perform a number of qualitative and quantitative evaluations suggesting that our approach can efficiently enhance the quality of underwater images, which can further support underwater object detection. Index Terms: Underwater image enhancement, generative adversarial networks, spiral training.
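The following is a minimal sketch of a pixel-wise objective combining a mean squared error with an angle error between corresponding RGB color vectors, which is one common way to realize the two loss terms named above. The angle formulation, the weight `lam`, and all names are assumptions rather than the paper's exact definition.

```python
import torch
import torch.nn.functional as F

def pixelwise_loss(pred, target, lam=0.1, eps=1e-7):
    """Sketch of an MSE + angle-error pixel loss (weight `lam` is assumed).

    pred, target: (b, 3, h, w) RGB images in [0, 1]. The angle term
    penalizes the angular difference between per-pixel RGB color
    vectors, a standard device for enforcing color correction.
    """
    mse = F.mse_loss(pred, target)
    # Cosine of the angle between per-pixel RGB vectors (channel dim = 1),
    # clamped to keep acos numerically stable at the boundaries.
    cos = F.cosine_similarity(pred, target, dim=1).clamp(-1 + eps, 1 - eps)
    angle = torch.acos(cos).mean()
    return mse + lam * angle
```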
An improved deep learning model, YOLOv5-DN, based on YOLOv5 is proposed for marine ship detection and classification in harbours and heavy-traffic waterways. The CSP-DarkNet module in YOLOv5 is replaced by CSP-DenseNet to improve the accuracy of target detection and classification. Sample marine ships in the data set are divided into six classes to meet the detection needs of ports and waterways: ore carriers, general cargo ships, bulk cargo ships, container ships, passenger ships, and fishing ships. The data set is split into a training set, a testing set, and a validation set in a 6:2:2 ratio. Experiments show that the improved model raises the average detection accuracy from 62.2% to 71.6%.
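To illustrate the dense-connectivity pattern that distinguishes DenseNet-style backbones from DarkNet-style ones, here is a minimal dense block sketch; the layer count, growth rate, and activation are illustrative assumptions and not the paper's exact CSP-DenseNet architecture.

```python
import torch
import torch.nn as nn

class DenseBlock(nn.Module):
    """Minimal dense block: each layer receives the concatenation of
    all preceding feature maps. Hyperparameters here (growth rate,
    number of layers, SiLU activation) are assumptions for illustration.
    """
    def __init__(self, in_ch, growth=32, n_layers=3):
        super().__init__()
        self.layers = nn.ModuleList()
        ch = in_ch
        for _ in range(n_layers):
            self.layers.append(nn.Sequential(
                nn.Conv2d(ch, growth, kernel_size=3, padding=1, bias=False),
                nn.BatchNorm2d(growth),
                nn.SiLU(inplace=True),
            ))
            ch += growth  # dense connectivity grows the input channel count

    def forward(self, x):
        feats = [x]
        for layer in self.layers:
            # Each layer consumes every earlier feature map, concatenated.
            feats.append(layer(torch.cat(feats, dim=1)))
        return torch.cat(feats, dim=1)
```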