Purpose We present a fully image-based visual servoing framework for neurosurgical navigation and needle guidance. The proposed servo-control scheme compensates for movements of the target anatomy, maintains high navigational accuracy over time, and automatically aligns a needle guide for accurate manual insertions.

Method Our system comprises a motorized 3D ultrasound (US) transducer mounted on a robotic arm and equipped with a needle guide. It continuously registers US sweeps in real time with a pre-interventional plan based on CT or MR images and annotations. While a visual control law maintains anatomy visibility and alignment of the needle guide, a force controller ensures acoustic coupling and regulates tissue pressure. We validate the servoing capabilities of our method on a geometric gel phantom and on real human anatomy, and evaluate the needle targeting accuracy using CT images on a lumbar spine gel phantom under neurosurgical conditions.

Results Despite the varying resolution of the acquired 3D sweeps, we achieved direction-independent positioning errors of 0.35 ± 0.19 mm in translation and 0.61° ± 0.45° in rotation. Our method is capable of compensating for movements of around 25 mm/s and works reliably on human anatomy with errors of 1.45 ± 0.78 mm. In all four manual insertions performed by an expert surgeon, the needle was successfully placed in the facet joint, with an estimated targeting accuracy of 1.33 ± 0.33 mm, superior to the gold standard.

Conclusion The experiments demonstrated the feasibility of robotic ultrasound-based navigation and needle guidance for neurosurgical applications such as lumbar spine injections.
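The abstract describes the control architecture only briefly. As a rough illustration of how such a hybrid visual/force servo loop might be structured, the sketch below combines a proportional correction of the registration-derived pose error with force regulation along the probe axis. This is not the authors' implementation; all gains, frame conventions, and helper names are assumptions introduced here for illustration.

```python
# Minimal sketch (assumed structure, not the paper's implementation) of one
# iteration of a hybrid visual/force servo loop: the US-to-plan registration
# error drives the probe pose, while the contact force along the probe axis
# is regulated to maintain acoustic coupling.
import numpy as np

K_VISUAL = 0.5    # proportional gain on the registration pose error [1/s] (assumed)
K_FORCE = 0.002   # proportional gain on the contact-force error [m/(s*N)] (assumed)
F_DESIRED = 5.0   # desired coupling force along the probe axis [N] (assumed)


def servo_step(pose_error, force_measured, probe_axis):
    """Return a 6-DoF velocity command (vx, vy, vz, wx, wy, wz).

    pose_error     : 6-vector (translation [m], rotation [rad]) from the
                     US-to-plan registration, expressed in the probe frame.
    force_measured : scalar contact force along the probe axis [N].
    probe_axis     : unit 3-vector of the probe axis in the probe frame.
    """
    # Visual control law: proportional correction of the registration error.
    v_cmd = -K_VISUAL * np.asarray(pose_error, dtype=float)

    # Force control: replace the translational component along the probe axis
    # so that coupling pressure is regulated instead of position.
    axis = np.asarray(probe_axis, dtype=float)
    v_in_plane = v_cmd[:3] - np.dot(v_cmd[:3], axis) * axis
    v_force = K_FORCE * (F_DESIRED - force_measured) * axis
    v_cmd[:3] = v_in_plane + v_force
    return v_cmd


if __name__ == "__main__":
    # One illustrative iteration with made-up sensor values.
    pose_err = np.array([0.004, -0.002, 0.0, 0.01, 0.0, -0.005])
    cmd = servo_step(pose_err, force_measured=3.2,
                     probe_axis=np.array([0.0, 0.0, 1.0]))
    print("velocity command:", np.round(cmd, 4))
```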