Animal behavior emerges from a seamless interplay among musculoskeletal elements, neural network dynamics, and the environment. Accessing and understanding the interactions between these intertwined systems requires integrative neuromechanical simulations. Until now, no such simulation framework has existed for the widely studied model organism Drosophila melanogaster. Here we present NeuroMechFly, a data-driven computational model of an adult female fly designed to synthesize rapidly growing experimental datasets and to test theories of neuromechanical behavioral control. NeuroMechFly combines a set of modules, including an exoskeleton with articulating body parts (limbs, halteres, wings, abdominal segments, head, proboscis, and antennae), muscle models, and neural networks, within a physics-based simulation environment. Using this computational framework, we (i) predict the minimal limb degrees of freedom needed to reproduce real Drosophila behaviors, (ii) estimate contact reaction forces, torques, and tactile signals during replayed Drosophila walking and grooming, and (iii) discover neural network and muscle parameters that can drive tripod walking. Thus, NeuroMechFly is a powerful testbed for understanding how behaviors emerge from interactions between complex neuromechanical systems and their physical surroundings.
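To make point (iii) concrete, the sketch below illustrates, in a toy form, how a small network of coupled phase oscillators (one per leg) can converge to the alternating tripod coordination pattern that an optimized walking controller produces. This is not NeuroMechFly's actual neural network, muscle model, or API; the leg labels, frequencies, and coupling constants are illustrative assumptions chosen only to show the principle.

```python
# Minimal sketch (assumption: NOT the NeuroMechFly controller or its API) of a
# Kuramoto-style network of six phase oscillators settling into a tripod pattern.
import numpy as np

LEGS = ["LF", "LM", "LH", "RF", "RM", "RH"]                      # left/right, front/middle/hind
TRIPOD = {"LF": 0, "RM": 0, "LH": 0, "RF": 1, "LM": 1, "RH": 1}  # two alternating groups

F_STEP = 2.0     # stepping frequency in Hz (illustrative value)
COUPLING = 5.0   # coupling strength (illustrative value)
DT = 1e-3        # integration time step in seconds
T_TOTAL = 3.0    # simulated duration in seconds

# Desired phase offset between legs: 0 within a tripod group, pi across groups.
phase_bias = np.array([[np.pi * (TRIPOD[a] != TRIPOD[b]) for b in LEGS] for a in LEGS])

rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, size=len(LEGS))  # random initial phases

for _ in range(int(T_TOTAL / DT)):
    # Each oscillator is pulled toward its desired offset relative to every other one.
    diffs = theta[None, :] - theta[:, None] - phase_bias
    dtheta = 2.0 * np.pi * F_STEP + COUPLING * np.sin(diffs).sum(axis=1)
    theta = (theta + DT * dtheta) % (2.0 * np.pi)

# Convention assumed here: a leg is in stance when its oscillator output is positive.
for leg, phase in zip(LEGS, theta):
    state = "stance" if np.sin(phase) > 0.0 else "swing"
    print(f"{leg}: phase = {phase:5.2f} rad, {state}")
```

Running the sketch shows the two tripod groups locking to nearly identical phases within a group and an offset of roughly pi between groups; in a full neuromechanical model such oscillator outputs would drive muscle models rather than print a stance/swing label.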