This paper addresses the problem of regulating impedance levels and the Centre-of-Mass (CoM) position, for the purpose of maintaining balance, by exploiting visual feedback from the robot's environment. Once endowed with visual perception, the robot becomes capable of determining whether a moving object in its proximity could threaten its balance, based on the object's relative distance and velocity. Detecting such an event in real time then calls for a preparatory action that enables timely balancing. In this context, however, the latter can be viewed as a twofold problem: in addition to triggering a motion that transfers the Centre of Pressure (CoP) towards the side of the support polygon at which an external impact is imminent, the robot's impedance needs to vary appropriately in order to absorb this disturbance. Experimental results obtained with an ASUS Xtion PRO Live depth camera mounted on the COmpliant huMANoid (COMAN) clearly demonstrate the balance augmentation achievable when visual feedback is incorporated into a balancing controller, thus partially corroborating the general hypothesis that visual perception could pave the way for human-like bipedal balancing.
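As a minimal illustrative sketch of the detection-and-response loop summarized above, the following Python fragment shows one way a relative-distance/velocity threat test and the twofold preparatory response (CoP shift plus impedance increase) could be structured. All thresholds, gains, and function names are assumptions for illustration only, not the controller reported in the paper.

```python
import numpy as np

# Assumed tuning values (illustrative, not from the paper)
DIST_THRESHOLD = 1.5   # [m]  proximity radius around the robot
TTI_HORIZON = 0.8      # [s]  minimum reaction time needed for a preparatory action

def is_balance_threat(rel_pos, rel_vel):
    """Flag an object as a balance threat from its relative distance and velocity.

    rel_pos : np.ndarray, object position relative to the robot base [m]
    rel_vel : np.ndarray, object velocity relative to the robot base [m/s]
    """
    dist = np.linalg.norm(rel_pos)
    # Positive closing speed means the object is approaching the robot
    closing_speed = -np.dot(rel_pos, rel_vel) / max(dist, 1e-6)
    if dist > DIST_THRESHOLD or closing_speed <= 0.0:
        return False
    time_to_impact = dist / closing_speed
    return time_to_impact < TTI_HORIZON

def preparatory_action(rel_pos):
    """Twofold response sketch: shift the CoP reference towards the side of the
    support polygon where impact is expected, and scale up joint stiffness and
    damping to absorb the disturbance. Offsets and scales are placeholders."""
    impact_dir = rel_pos[:2] / max(np.linalg.norm(rel_pos[:2]), 1e-6)
    cop_offset = 0.05 * impact_dir           # [m] CoP shift towards the impact side
    stiffness_scale, damping_scale = 1.5, 1.3
    return cop_offset, stiffness_scale, damping_scale
```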