Over the past decades, robots have been deployed extensively across many industries. More recently, industrial robots have moved out of their cages into dynamic and uncertain environments, where they interact in close vicinity to human operators. Traditionally, robots have mainly been developed to perform pre-programmed tasks; however, some tasks are still too complex or too expensive to be performed by a robotic system alone. One such task is the handling of large sheet-like objects, as found in composite part production or plastic film wrapping. This work presents a hybrid wrench- and vision-based reactive robot control approach for the handling of large (non-)rigid materials. The approach fuses force-torque sensor data with skeleton-tracking data from a camera to control a mobile manipulator in an intuitive manner, exploiting the intelligence of the human operator as much as possible. With this approach, high-level tools such as path planning, task planning, or modeling of the manipulated objects are not required. The hybrid controller is evaluated in stability experiments in which the controller response of the mobile manipulator is monitored for step and sinusoidal reference inputs. Finally, the overall approach is demonstrated in a co-manipulation proof-of-concept task in which a flexible textile sheet is handled jointly by a mobile manipulator and a human operator.
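As a rough illustration of the kind of sensor fusion described above (not the controller used in this work), the sketch below blends a measured wrench from a force-torque sensor with an operator hand offset obtained from skeleton tracking into a bounded end-effector velocity command using a simple admittance-style law. All gains, thresholds, and function names are hypothetical and chosen only for illustration.

```python
import numpy as np

# Illustrative sketch only: a simple admittance-style blend of a measured
# wrench (from a force-torque sensor) and an operator hand offset (from
# camera-based skeleton tracking) into an end-effector velocity command.
# Gains and thresholds are hypothetical, not taken from this work.

K_WRENCH = 0.02      # m/s per N   (compliance to operator-applied forces)
K_VISION = 0.5       # 1/s         (attraction toward the tracked hand offset)
V_MAX = 0.25         # m/s         (velocity saturation for safety)
DEADBAND_N = 2.0     # N           (ignore sensor noise below this force)


def hybrid_velocity_command(wrench_force, hand_offset):
    """Blend force and vision cues into a translational velocity command.

    wrench_force : (3,) array, measured force at the end effector [N]
    hand_offset  : (3,) array, tracked operator hand position relative to
                   the desired grasp frame [m]
    returns      : (3,) array, commanded end-effector velocity [m/s]
    """
    f = np.asarray(wrench_force, dtype=float)
    # Dead-band the force so sensor noise does not drive the robot.
    f = np.where(np.abs(f) < DEADBAND_N, 0.0, f)

    v = K_WRENCH * f + K_VISION * np.asarray(hand_offset, dtype=float)

    # Saturate the command so the reactive behaviour stays bounded.
    norm = np.linalg.norm(v)
    if norm > V_MAX:
        v *= V_MAX / norm
    return v


if __name__ == "__main__":
    # Operator pulls with 10 N along x while their hand is 0.1 m ahead in y.
    print(hybrid_velocity_command([10.0, 0.0, 0.0], [0.0, 0.1, 0.0]))
```

The force dead-band and the velocity saturation are generic safeguards in reactive co-manipulation schemes; the actual approach presented here additionally coordinates the mobile base and arm and is evaluated with step and sinusoidal inputs as described above.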