Image classification using deep transfer learning has received significant attention, benefiting from models pre-trained on large-scale annotated datasets and from continuous improvements in neural network architectures. In contrast to general-purpose image classification, however, few publicly available maritime datasets are suited to deep transfer learning. Owing to the effort of data gathering and the computational cost involved, existing maritime datasets lack both a method for merging datasets and a benchmark for few-shot classifiers. In this paper, we propose the double transfer method, consisting of a merging datasets network and a backbone network, to address this problem. The merging datasets network measures image similarity to separate the classes of known and unknown samples and reorganize a dataset, while the backbone network is constructed from EfficientNet-b5 through network-based deep transfer learning. Using the merging datasets network, we introduce the visible maritime image dataset, which contains 3,750 images in twenty-five classes covering a wide variety of maritime objects. The backbone networks are evaluated and analyzed on this dataset using accuracy, precision, recall, and F-measure metrics. With the double transfer method, we achieve an accuracy of 91.39% on the visible maritime image dataset.
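The abstract does not specify the implementation, but the backbone construction it describes follows a standard network-based transfer learning pattern. The sketch below is only an illustration of that pattern, assuming PyTorch/torchvision, an ImageNet-pre-trained EfficientNet-b5, and a twenty-five-class output head; the variable names and the frozen-feature-extractor choice are illustrative assumptions, not the authors' code.

```python
# Minimal sketch (not the authors' implementation): network-based transfer
# learning with an ImageNet-pre-trained EfficientNet-b5 backbone.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 25  # twenty-five maritime classes, as stated in the abstract

# Load EfficientNet-b5 with weights pre-trained on ImageNet.
weights = models.EfficientNet_B5_Weights.IMAGENET1K_V1
backbone = models.efficientnet_b5(weights=weights)

# Assumption: freeze the convolutional feature extractor and fine-tune
# only the new classification head on the maritime dataset.
for param in backbone.features.parameters():
    param.requires_grad = False

# Replace the 1000-way ImageNet classifier with a maritime-class head.
in_features = backbone.classifier[1].in_features
backbone.classifier[1] = nn.Linear(in_features, NUM_CLASSES)

# Sanity check with a dummy batch at EfficientNet-b5's native resolution (456x456).
x = torch.randn(1, 3, 456, 456)
logits = backbone(x)
print(logits.shape)  # torch.Size([1, 25])
```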