A hands-free (HF) lean-to-steer control concept that uses torso motions is demonstrated by navigating a virtual robotic mobility device based on a ball-based robotic (ballbot) wheelchair. A custom sensor system, the Torso-dynamics Estimation System (TES), was used to measure the dynamics of the rider's torso motions and convert them into commands that provide HF control of the robot. A simulation study was conducted to explore the efficacy of the HF controller relative to a traditional joystick (JS) controller, and to examine whether performance differed between manual wheelchair users (mWCUs), who may have reduced torso function, and able-bodied users (ABUs). Twenty test subjects (10 mWCUs + 10 ABUs) used the TES, adjusted to each subject, while wearing a virtual reality headset and were asked to navigate a virtual human rider on the ballbot through obstacle courses replicating seven indoor environment zones. Repeated measures MANOVA tests assessed performance metrics representing efficiency (i.e., number of collisions), effectiveness (i.e., completion time), comfort (i.e., NASA TLX scores), and robustness (i.e., index of performance). As expected, more challenging zones took longer to complete and resulted in more collisions. An interaction effect was observed: ABUs had significantly more collisions with JS control than with HF control, while mWCUs showed little difference between interfaces. All subjects reported that HF control required greater physical demand than JS control, although no users visibly showed or expressed fatigue or exhaustion when using HF control. In general, HF control performed as well as JS control, and mWCUs performed similarly to ABUs.
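
The abstract does not specify how the TES converts torso dynamics into drive commands. The sketch below is a minimal illustration of one plausible lean-to-steer mapping, assuming the TES reports torso pitch (forward/back) and roll (side-to-side) angles and that velocity commands scale linearly with lean beyond a small dead-band. All names and constants (e.g., lean_to_velocity, DEADBAND_RAD, MAX_LINEAR_MPS) are hypothetical and are not taken from the paper.

```python
import math

# Hypothetical tuning constants; not from the paper.
DEADBAND_RAD = math.radians(2.0)   # ignore small postural sway
MAX_LEAN_RAD = math.radians(15.0)  # lean angle mapped to a full-scale command
MAX_LINEAR_MPS = 1.0               # forward speed at full forward lean
MAX_ANGULAR_RPS = 0.8              # turn rate at full sideways lean


def _scaled(angle_rad: float) -> float:
    """Map a lean angle to [-1, 1] with a dead-band around upright."""
    if abs(angle_rad) < DEADBAND_RAD:
        return 0.0
    sign = 1.0 if angle_rad > 0 else -1.0
    span = MAX_LEAN_RAD - DEADBAND_RAD
    return sign * min((abs(angle_rad) - DEADBAND_RAD) / span, 1.0)


def lean_to_velocity(pitch_rad: float, roll_rad: float) -> tuple[float, float]:
    """Convert torso pitch and roll into (linear m/s, angular rad/s) commands
    for the simulated ballbot."""
    v = _scaled(pitch_rad) * MAX_LINEAR_MPS
    w = _scaled(roll_rad) * MAX_ANGULAR_RPS
    return v, w


if __name__ == "__main__":
    # Example: leaning 8 deg forward while leaning 5 deg to the side.
    print(lean_to_velocity(math.radians(8.0), math.radians(5.0)))
```

A dead-band of this kind is one common way to keep normal postural sway from producing unintended motion; the paper's actual mapping and calibration per subject may differ.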
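The "index of performance" robustness metric is not defined in this excerpt. If it follows the common Fitts'-law throughput formulation, it can be computed as an index of difficulty divided by movement time, as sketched below; the Shannon form of the index of difficulty and all variable names here are assumptions, not details from the paper.

```python
import math


def index_of_difficulty(distance_m: float, width_m: float) -> float:
    """Shannon formulation of Fitts' index of difficulty (bits)."""
    return math.log2(distance_m / width_m + 1.0)


def index_of_performance(distance_m: float, width_m: float,
                         movement_time_s: float) -> float:
    """Throughput (bits/s): index of difficulty over completion time."""
    return index_of_difficulty(distance_m, width_m) / movement_time_s


if __name__ == "__main__":
    # Example: a 5 m approach through a 0.9 m doorway completed in 6 s.
    print(round(index_of_performance(5.0, 0.9, 6.0), 2), "bits/s")
```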