Myoelectric pattern recognition, the prevailing solution for prosthetic manipulation, constrains gesture-based interaction because it lacks proportional control information such as exerted force. This paper presents simultaneous gesture recognition and muscle contraction force estimation to realize proportional pattern recognition (PPR) control, exploiting arm muscle deformation measured by wearable ultrasound sensing. We conducted experiments on eight predefined hand motions over a force range of 0-60% maximum voluntary contraction (MVC) using a wearable multi-channel A-mode ultrasound system. The results show that more than 93.7% of gestures are correctly recognized under dynamic muscle contraction forces (0-60% MVC), even though the classifier was trained only at a low force level (<6% MVC). In addition, the adopted non-parametric Gaussian process regression estimates muscle contraction force accurately and synchronously, with an average coefficient of determination (R²) of 0.927 and a normalized root-mean-square error (nRMSE) of 0.102. These outcomes demonstrate the feasibility of ultrasound-based PPR control, paving the way for musculature-driven applications such as finer prosthetic control, remote manipulation, and rehabilitation treatment.
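To make the force-estimation pipeline concrete, the sketch below shows how a non-parametric Gaussian process regressor could map ultrasound-derived deformation features to contraction force (%MVC) and how R² and nRMSE would then be computed. This is a minimal illustration only: the kernel choice (RBF plus white noise), the use of scikit-learn, and the synthetic placeholder data are assumptions, not the authors' actual feature set or implementation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.metrics import r2_score, mean_squared_error

# Placeholder data: rows are ultrasound deformation features per frame,
# targets are contraction forces in %MVC (synthetic, for illustration only).
rng = np.random.default_rng(0)
w = np.array([0.4, 0.3, 0.2, 0.1])
X_train = rng.normal(size=(200, 4))
y_train = X_train @ w + 0.05 * rng.normal(size=200)
X_test = rng.normal(size=(50, 4))
y_test = X_test @ w + 0.05 * rng.normal(size=50)

# Non-parametric GP regression with an assumed RBF + noise kernel.
kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-2)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gpr.fit(X_train, y_train)

# Predict force and report R² and range-normalized RMSE, as in the abstract's metrics.
y_pred, y_std = gpr.predict(X_test, return_std=True)
r2 = r2_score(y_test, y_pred)
nrmse = np.sqrt(mean_squared_error(y_test, y_pred)) / (y_test.max() - y_test.min())
print(f"R^2 = {r2:.3f}, nRMSE = {nrmse:.3f}")
```

Note that nRMSE is normalized here by the target range; the paper may normalize differently (e.g., by the mean or by the MVC span), which would change the reported value.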