Interaction between humans and unmanned aerial vehicles is a promising field for future applications. However, current interfacing paradigms either require intermediary hardware such as monitors, joysticks, and haptic devices, or are limited to visual/auditory channels through hand gestures, voice recognition, or the interpretation of face poses and body postures. Another paradigm, physical human–robot interaction, based on the mutual exchange of forces, is popular with robotic arms and humanoids, whereas unmanned aerial vehicles are usually considered too dangerous and lacking proper interaction surfaces for exchanging forces. In this paper, we address the problem of physical human–unmanned aerial vehicle interaction and propose a straightforward approach that allows a human to intuitively command an unmanned aerial vehicle through the exchange of forces. Using a residual-based estimator, we estimate the external forces and torques acting on the unmanned aerial vehicle. By means of a sensor ring, we separate the human interaction forces from additional disturbances such as wind and parameter uncertainties. This knowledge is exploited within a control framework in which the human can modify the desired trajectory by simply applying forces to the unmanned aerial vehicle. The system is validated through multiple hardware-in-the-loop simulations and experiments covering different interaction modalities.
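To illustrate the two ingredients summarized above, the following is a minimal Python sketch of a momentum-based residual estimator for the translational dynamics of a multirotor, followed by a simple admittance law that shifts the desired trajectory under the estimated human force. The class names, gains, and the specific admittance form are illustrative assumptions for exposition, not the exact implementation used in the paper.

```python
import numpy as np

class ResidualForceEstimator:
    """Momentum-based residual estimator for the translational dynamics of a
    multirotor (illustrative sketch; mass, gains, and model are assumptions).

    World-frame model:  m * v_dot = m * g + R @ f_thrust_body + f_ext
    The residual r converges to f_ext with first-order dynamics set by K."""

    def __init__(self, mass, K, dt):
        self.m = mass                       # vehicle mass [kg], assumed known
        self.K = np.asarray(K, dtype=float) # observer gain [1/s]
        self.dt = dt                        # sampling time [s]
        self.g = np.array([0.0, 0.0, -9.81])
        self.integral = np.zeros(3)         # integral of modeled forces + residual
        self.r = np.zeros(3)                # current residual = estimated external force

    def update(self, v, R, f_thrust_body):
        """v: world-frame velocity, R: body-to-world rotation matrix,
        f_thrust_body: commanded thrust vector in the body frame."""
        # Integrate the modeled forces plus the current residual estimate.
        self.integral += (self.m * self.g + R @ f_thrust_body + self.r) * self.dt
        # Residual = gain * (measured momentum - integrated model momentum).
        self.r = self.K * (self.m * v - self.integral)
        return self.r


def admittance_update(x_d, v_d, f_human, M_a, D_a, dt):
    """Shift the desired trajectory according to the estimated human force
    using a simple admittance law M_a * a_d + D_a * v_d = f_human
    (parameters and law are illustrative)."""
    a_d = (f_human - D_a * v_d) / M_a
    v_d = v_d + a_d * dt
    x_d = x_d + v_d * dt
    return x_d, v_d
```

In a hypothetical control loop, the estimator would be updated at each step with the measured velocity, attitude, and commanded thrust; the resulting force estimate (after removing non-human disturbances, e.g., via the sensor ring) would then be passed to the admittance law to displace the reference trajectory tracked by the position controller.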