Discriminative correlation filter (DCF)-based tracking methods have shown good accuracy and efficiency in visual tracking. However, the periodic assumption on the sample space causes unwanted boundary effects, restricting the tracker's ability to distinguish between the target and the background. Additionally, in real tracking environments, interference factors such as occlusion, background clutter, and illumination changes cause response aberration and, thus, tracking failure. To address these issues, this work proposes a novel tracking method for visual tracking named the background-suppressed dual-regression correlation filter (BSDCF). First, we use a background suppression function to crop the target features out of the global features. In the training step, we introduce a spatial regularization constraint and a background response suppression regularization, and construct a dual-regression structure that trains the target filter and the global filter separately. The aim is to exploit the difference between the two output response maps as a mutual constraint that highlights the target and suppresses background interference. Furthermore, in the detection step, the global response is enhanced by a weighted fusion with the target response to further improve tracking performance in complex scenes. Finally, extensive experiments are conducted on three public benchmarks (OTB100, TC128, and UAVDT), and the results indicate that the proposed BSDCF tracker achieves tracking performance comparable to that of many state-of-the-art (SOTA) trackers in a variety of complex situations.
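As an illustrative sketch only (the abstract does not give the exact formulation), the detection-step fusion described above could take the form of a weighted combination of the two response maps, where $R_g$ and $R_t$ denote the global and target response maps and $\omega \in [0,1]$ is a hypothetical fusion weight:

\[
R_{\text{fused}} = (1-\omega)\, R_g + \omega\, R_t ,
\qquad
(\hat{x}, \hat{y}) = \arg\max_{(x,y)} R_{\text{fused}}(x, y).
\]

Under this reading, the predicted target location is the peak of the fused response, so a sharp target response can disambiguate an otherwise cluttered global response.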