This thesis presents a series of user evaluations of spatialized sonification methods rendered as augmented reality (AR) audio in simulated and real-life scenarios. It proposes and promotes next-generation micro-guidance methods for low-visibility and vision-impaired (VI) scenarios. In a 2D hand-guidance task, results (N=47) showed that sound spatiality methods performed best in terms of completion time and distance from the target. When assessing vertical hand-guidance in a 3D task (N=19), results indicated significantly higher accuracy for a novel height-to-pitch method. Finally, a significant disparity was found between VI (N=20) and sighted (N=77) participants regarding perceptions of sighted people's empathy towards the VI community. After an AR blindness embodiment experience, sighted participants' (N=15) empathetic and sympathetic responses towards that community increased significantly. Ultimately, this thesis evaluates how audio AR can help users perform day-to-day manual tasks accurately and safely.