Although open-air trackers have reached an advanced level, designing a tracker for degraded underwater images remains challenging. Underwater image enhancement can improve the performance of underwater trackers, but most enhancement methods aim to improve visual quality rather than to serve the tracker. We therefore explore a simple but powerful image domain-adaptive method that improves Stark's performance by enhancing its input images. Specifically, the method consists of an underwater image adaptation network (UIAN) with two heads and an adaptation block based on scene estimation (ABSE), which comprises three independent image processing modules that involve no deep learning. UIAN predicts the category of the image domain and the parameters of ABSE; ABSE decodes these parameters and processes underwater images sequentially through each module. UIAN is trained independently of the tracker: the class prediction head is trained first and its weights are frozen; then, by initializing and tracking on an enhanced image and computing the tracker's loss, the parameter head is trained so that UIAN's hyperparameters match the Stark tracker. The proposed UStark adaptively processes both clear and degraded underwater images. Compared with Stark, UStark improves accuracy and success rate in typical underwater environments by 3.7% and 1.5% (blue), 5% and 3.4% (yellow), and 5.4% and 3.3% (dark), respectively. In addition, compared with other underwater image enhancement methods, our method improves tracker performance on more categories of underwater images.
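The abstract does not name ABSE's three modules, so the following is only a minimal sketch of the idea: three classical, non-learned enhancement steps applied in sequence, driven by parameters that would normally come from the UIAN parameter head. The module choices (per-channel color gains, gamma correction, contrast stretch) and the parameter layout are assumptions for illustration.

```python
import numpy as np

def apply_abse(image, params):
    """Sketch of an ABSE-style block: three classical enhancement
    modules applied in sequence, controlled by predicted parameters.
    The specific modules and parameter names here are hypothetical."""
    img = image.astype(np.float32) / 255.0

    # Module 1: per-channel color compensation (white-balance-like gains)
    img = np.clip(img * params["channel_gains"], 0.0, 1.0)

    # Module 2: gamma correction for global brightness
    img = np.power(img, params["gamma"])

    # Module 3: linear contrast stretch around the global mean
    mean = img.mean()
    img = np.clip(mean + params["contrast"] * (img - mean), 0.0, 1.0)

    return (img * 255.0).astype(np.uint8)

# Example parameters (in UStark these would be decoded from the
# UIAN parameter head, not hand-set).
params = {"channel_gains": np.array([1.1, 1.0, 0.9]),
          "gamma": 0.8,
          "contrast": 1.2}
enhanced = apply_abse(np.full((4, 4, 3), 128, dtype=np.uint8), params)
```

Because the modules are parameterized rather than learned, the pipeline stays cheap at inference time, and the deep network only has to predict a handful of scalars per frame.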
A visual tracker consists of a network and post-processing. Despite the color distortion and low contrast of underwater images, advanced trackers remain competitive in underwater object tracking because deep learning enables their networks to discriminate the target's appearance features. However, underwater object tracking faces another problem: underwater targets such as fish and dolphins usually appear in groups, and creatures of the same species share similar appearance features, so it is challenging for the network alone to distinguish such subtle differences. Existing detection-based post-processing reflects only single-frame detection results and cannot locate the real target among similar ones. In this paper, we propose a new motion-based post-processing strategy that uses a Kalman filter (KF) to maintain the target's motion information and exclude similar nearby targets. Specifically, we combine the KF-predicted box with the candidate boxes in the response map and their confidences to compute a candidate location score that identifies the real target. Our method neither changes the network structure nor requires additional training of the tracker, and it can be quickly applied to other tracking fields with similar-target problems. We improved SOTA trackers with our method and demonstrated its effectiveness on UOT100 and UTB180. For OSTrack on similar-target subsequences, our method improves AUC by more than 3% on average, and precision and normalized precision by more than 3.5% on average. The method is compatible with other approaches to the similar-target problem and can enhance tracker performance alongside them. More details can be found at: https://github.com/LiYunfengLYF/KF_in_underwater_trackers.
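The abstract describes scoring candidate boxes against the KF-predicted box but does not give the scoring formula, so the following is only a hedged sketch: each candidate's detection confidence is mixed with its overlap (IoU) with the KF prediction, and the highest-scoring candidate is chosen. The linear mixing weight `alpha` and the function names are assumptions, not the paper's actual formulation.

```python
import numpy as np

def iou(a, b):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def select_candidate(kf_box, candidates, confidences, alpha=0.5):
    """Score each candidate by a weighted mix of its detection
    confidence and its overlap with the KF-predicted box, and
    return the index of the best one. `alpha` is a hypothetical
    weighting hyperparameter."""
    scores = [alpha * c + (1 - alpha) * iou(kf_box, b)
              for b, c in zip(candidates, confidences)]
    return int(np.argmax(scores))

# Example: the distractor (second box) has slightly higher confidence,
# but agreement with the KF prediction selects the true target.
kf_box = (10, 10, 50, 50)
candidates = [(12, 11, 52, 51), (80, 80, 120, 120)]
confidences = [0.85, 0.90]
print(select_candidate(kf_box, candidates, confidences))  # → 0
```

In a full tracker loop, the selected box would then be fed back as the KF measurement update before the next frame, which is what lets the motion model keep locking onto the same individual among look-alikes.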