For individuals with Autism Spectrum Disorder (ASD), the inability to make eye contact is a significant barrier to engagement in social environments. This lack of eye contact limits their ability to read the social and emotional cues conveyed through facial expressions, resulting in a corresponding decrease in social engagement. The use of interactive virtual environments (VEs) as a therapeutic protocol is a growing field of study. In recent studies, individuals with ASD placed in VEs engaged with avatars controlled by a human operator in the background, and some subjects showed improvements in eye contact and engagement. This paper presents the first in a series of experiments exploring the potential of virtual avatars controlled through software agency, rather than by a human, as a therapeutic tool for ASD. It examines whether a subject can learn to make eye contact with an avatar and consequently recognize and respond to the emotional cues the avatar expresses. Results indicate that children with ASD can learn to recognize the virtual avatar's emotional cues, and that both their responses to the avatar's needs and their eye contact with the avatar improved over the course of the VE experiment. This study sets the stage for future exploration of the therapeutic use of agent-based virtual avatars, including the transference of emotional cues from avatars to humans in the real world.