Urban Air Mobility (UAM) is emerging as a transformative approach to urban congestion and pollution, offering efficient and sustainable transportation of people and goods. Central to UAM is the Operational Digital Twin (ODT), which enables real-time management of air traffic and enhances safety and efficiency. This study introduces YOLOTransfer-DT, a framework designed for Artificial Intelligence (AI) training in simulated environments, with a focus on experiential learning in realistic scenarios. The framework aims to augment AI training, in particular the development of an object detection system that uses visual tasks for proactive conflict identification and mission support, leveraging deep learning and transfer learning techniques. The proposed methodology combines real-time detection, transfer learning, and a novel mix-up process for environmental data extraction, and is tested rigorously in realistic simulations. The findings validate the use of existing deep learning models for real-time object recognition under such conditions. This research underscores the value of the ODT framework in bridging the gap between virtual and actual environments, highlighting the safety and cost-effectiveness of virtual testing. The adaptable framework facilitates extensive experimentation and training, demonstrating its potential as a foundation for advanced detection techniques in UAM.
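
To make the combination of mix-up augmentation and transfer learning concrete, the minimal sketch below illustrates the general idea: simulated and real frames are blended with a Beta-sampled coefficient (the standard mix-up formulation, not necessarily the authors' exact process), and a pretrained YOLO detector is then fine-tuned on the resulting data. The dataset file, frame arrays, and hyperparameters are illustrative assumptions, not details taken from the paper.

```python
# Sketch: standard mix-up blending of simulated and real frames, followed by
# transfer-learning fine-tuning of a pretrained YOLO model.
# NOTE: "uam_sim.yaml", the synthetic frames, and all hyperparameters are
# illustrative assumptions, not the authors' configuration.
import numpy as np

def mixup_frames(sim_frame: np.ndarray, real_frame: np.ndarray,
                 alpha: float = 0.4) -> tuple[np.ndarray, float]:
    """Blend a simulated and a real frame with a Beta-sampled weight (mix-up)."""
    lam = np.random.beta(alpha, alpha)
    mixed = lam * sim_frame + (1.0 - lam) * real_frame
    return mixed.astype(sim_frame.dtype), lam

# Synthetic arrays stand in for a rendered simulation frame and a captured frame.
sim = np.random.randint(0, 256, (640, 640, 3)).astype(np.float32)
real = np.random.randint(0, 256, (640, 640, 3)).astype(np.float32)
mixed, lam = mixup_frames(sim, real)
print(f"mixing coefficient lambda = {lam:.3f}")

# Transfer learning step: start from pretrained weights and fine-tune on the
# blended simulation dataset (requires the `ultralytics` package; commented out
# so the sketch runs without it).
# from ultralytics import YOLO
# model = YOLO("yolov8n.pt")               # pretrained detector
# model.train(data="uam_sim.yaml",         # hypothetical dataset config
#             epochs=50, freeze=10)        # freeze early layers, tune the head
```

In this reading, mix-up supplies intermediate samples between the virtual and real domains, which is one common way to narrow the simulation-to-reality gap before fine-tuning the detector.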