Visual servo grasping has attracted considerable attention in intelligent manufacturing for its potential to improve both the flexibility and precision of robotic operations. Traditional approaches, however, often fail when visual features leave the camera's field of view (FoV) and become unstable when the interaction matrix approaches singularity, which limits their effectiveness in complex environments. This study introduces a novel control strategy that uses an asymmetric time-varying performance function to address visual feature escape: by strictly bounding the feature error, the approach keeps the visual features within the camera's FoV at all times and improves both transient and steady-state performance. In addition, an adaptive damped least squares controller is developed that adjusts the damping term online to suppress the numerical instability caused by interaction matrix singularities. The method is validated through grasping experiments involving large rotations about the camera's optical axis and other complex motions.
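For reference, asymmetric time-varying performance bounds of this kind are commonly written with an exponentially decaying envelope; the sketch below is an illustrative, assumed form (the symbols $\rho_0$, $\rho_\infty$, $\kappa$, $\underline{\delta}$, $\overline{\delta}$ are not taken from the paper, and the exact function used in this work may differ):

\[
-\underline{\delta}\,\rho(t) \;<\; e(t) \;<\; \overline{\delta}\,\rho(t),
\qquad
\rho(t) = \left(\rho_0 - \rho_\infty\right)e^{-\kappa t} + \rho_\infty ,
\]

where $\rho_0 > \rho_\infty > 0$ set the initial and steady-state error envelopes, $\kappa > 0$ fixes the convergence rate, and the asymmetric scalars $\underline{\delta}, \overline{\delta} \in (0,1]$ allow different upper and lower margins, so the constrained feature error cannot drift toward the FoV boundary during the transient.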
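Likewise, a standard damped least squares inverse of the image interaction matrix $L$, with damping adapted near singular configurations, gives a sense of the second contribution; this is a generic sketch of the well-known technique, not the specific adaptation law proposed in the paper:

\[
v_c = -\lambda\, L^{\top}\!\left(L L^{\top} + \mu^{2} I\right)^{-1} e,
\qquad
\mu^{2} =
\begin{cases}
0, & \sigma_{\min}(L) \ge \epsilon,\\[2pt]
\left(1 - \left(\dfrac{\sigma_{\min}(L)}{\epsilon}\right)^{2}\right)\mu_{\max}^{2}, & \sigma_{\min}(L) < \epsilon,
\end{cases}
\]

where $e$ is the feature error, $\lambda > 0$ is the control gain, $\sigma_{\min}(L)$ is the smallest singular value of the interaction matrix, $\epsilon$ is a threshold defining the neighborhood of a singularity, and $\mu_{\max}$ bounds the damping; increasing $\mu$ near singularities trades tracking accuracy for numerical stability of the inverse.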