Olfaction and vision can play important roles in optimizing the foraging decisions of birds, enabling them to maximize their net rate of energy intake while searching for, handling, and consuming food. Parrots have been used extensively in avian cognition research, and some species use olfactory cues to find food. Here we pioneered the use of machine-learning analysis and pose estimation with convolutional neural networks (CNNs) to elucidate the relative importance of visual and olfactory cues in the foraging decisions of the rosy-faced lovebird (Agapornis roseicollis), an atypical model species. In a binary choice experiment, we used markerless body-pose tracking to analyse the birds' response behaviours. Rosy-faced lovebirds quickly learnt to identify the food-provisioned feeder by forming an association with visual (red/green paper) but not olfactory (banana/almond odour) cues. When visual cues distinguished the provisioned feeder from the empty one, feeder choices were more successful, choice latency was shorter, and interest in the empty feeder was significantly lower. These results demonstrate that visual cues alone are sufficient to inform lovebird foraging decisions, without recourse to olfactory cues, suggesting that selection has not driven the evolution of olfaction-based foraging in lovebirds.
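
To make the pose-based analysis concrete, the following is a minimal sketch, not the authors' pipeline, of how feeder-choice metrics such as first choice, choice latency, and time spent near each feeder could be derived from markerless pose-tracking output. It assumes the CNN tracker exports per-frame head coordinates to a CSV; the file name, column names, feeder coordinates, proximity radius, and frame rate are all hypothetical.

```python
"""Illustrative sketch (not the authors' method): deriving feeder-choice
metrics from markerless pose-tracking output. Assumes a CSV with columns
frame, head_x, head_y; all constants below are hypothetical values."""

import numpy as np
import pandas as pd

FPS = 30.0                     # hypothetical video frame rate
NEAR_PX = 80.0                 # hypothetical "at feeder" radius in pixels
FEEDERS = {"left": (120.0, 240.0), "right": (520.0, 240.0)}  # hypothetical


def feeder_metrics(track: pd.DataFrame) -> dict:
    """Return first-choice feeder, choice latency (s), and dwell time (s)."""
    xy = track[["head_x", "head_y"]].to_numpy()
    # Per-frame distance from the head keypoint to each feeder centre.
    dists = {name: np.hypot(xy[:, 0] - fx, xy[:, 1] - fy)
             for name, (fx, fy) in FEEDERS.items()}
    near = {name: d < NEAR_PX for name, d in dists.items()}

    # First frame at which the bird comes within NEAR_PX of any feeder.
    any_near = np.logical_or.reduce(list(near.values()))
    if not any_near.any():
        return {"choice": None, "latency_s": None,
                "time_near_s": {k: 0.0 for k in FEEDERS}}
    first = int(np.argmax(any_near))
    choice = min(FEEDERS, key=lambda k: dists[k][first])

    return {
        "choice": choice,                        # feeder approached first
        "latency_s": first / FPS,                # choice latency in seconds
        "time_near_s": {k: float(v.sum()) / FPS  # dwell time per feeder
                        for k, v in near.items()},
    }


if __name__ == "__main__":
    track = pd.read_csv("trial_01_head_track.csv")  # hypothetical file
    print(feeder_metrics(track))
```

Under these assumptions, "interest in the empty feeder" would correspond to the dwell time recorded for the unprovisioned feeder, and choice success to whether the first-approached feeder was the provisioned one.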