Objective. The lack of intuitive control in present myoelectric interfaces makes it challenging for users to interact with assistive devices efficiently under real-world conditions. This study aims to address this difficulty by incorporating neurophysiological constructs, namely muscle and force synergies, into multi-finger force estimation to enable intuitive myoelectric control.

Approach. Eleven healthy subjects performed six isometric grasping tasks at three muscle contraction levels. The exerted fingertip forces were recorded concurrently with surface electromyographic (sEMG) signals from six extrinsic and intrinsic muscles of the hand. Muscle synergies were then extracted from the recorded sEMG signals, while force synergies were identified from the measured force data. Afterwards, a linear regressor was trained to associate the two types of synergies, allowing multi-finger forces to be predicted simply by multiplying the activation signals derived from the muscle synergies with the weighting matrix of the initially identified force synergies (an illustrative sketch of this pipeline is given below). To mitigate false activation of unintended fingers, the force predictions were finally corrected by a finger state recognition procedure.

Main results. We found that five muscle synergies and four force synergies offer a good trade-off between computational load and prediction accuracy for the proposed model. When trained and tested on all six grasping tasks, our method (SYN-II) achieved better performance (R² = 0.80 ± 0.04, NRMSE = 0.19 ± 0.01) than the conventional sEMG amplitude-based method. Interestingly, SYN-II performed better than all other methods when tested on two unknown tasks outside the four training tasks (R² = 0.74 ± 0.03, NRMSE = 0.22 ± 0.02), indicating better generalization ability.

Significance. This study represents the first attempt to link muscle and force synergies for concurrent and continuous estimation of multi-finger forces from sEMG. The proposed approach may lay the foundation for high-performance myoelectric interfaces that allow users to control robotic hands in a more natural and intuitive manner.
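To make the synergy-to-synergy mapping concrete, the following is a minimal sketch of the estimation pipeline described in the Approach, assuming non-negative matrix factorization (NMF) for synergy extraction and scikit-learn-style estimators. The function names, matrix shapes, and the choice of NMF are illustrative assumptions and are not taken from the study itself; the finger state recognition correction step is omitted.

```python
# Hypothetical sketch: synergy-based multi-finger force estimation from sEMG.
# Assumes NMF-based synergy extraction and non-negative, low-pass filtered inputs.
import numpy as np
from sklearn.decomposition import NMF
from sklearn.linear_model import LinearRegression

def train_synergy_model(emg_env, forces, n_muscle_syn=5, n_force_syn=4):
    """emg_env: (samples, n_muscles) non-negative sEMG envelopes.
       forces:  (samples, n_fingers) fingertip forces (assumed non-negative here)."""
    # Muscle synergies: emg_env ~= H_m @ W_m, where H_m are activation signals.
    nmf_m = NMF(n_components=n_muscle_syn, init='nndsvda', max_iter=500)
    H_m = nmf_m.fit_transform(emg_env)

    # Force synergies: forces ~= H_f @ W_f, where W_f is the weighting matrix.
    nmf_f = NMF(n_components=n_force_syn, init='nndsvda', max_iter=500)
    H_f = nmf_f.fit_transform(forces)
    W_f = nmf_f.components_                      # (n_force_syn, n_fingers)

    # Linear regressor associating muscle-synergy activations with force-synergy activations.
    reg = LinearRegression().fit(H_m, H_f)
    return nmf_m, reg, W_f

def predict_forces(nmf_m, reg, W_f, emg_env_new):
    """Estimate multi-finger forces from new sEMG envelopes."""
    H_m_new = nmf_m.transform(emg_env_new)       # muscle-synergy activations
    H_f_hat = reg.predict(H_m_new)               # predicted force-synergy activations
    return H_f_hat @ W_f                         # reconstructed fingertip forces
```

In this sketch the force synergy weighting matrix W_f is identified once from the training force data and kept fixed, so online prediction reduces to extracting muscle-synergy activations from sEMG and multiplying the regressed activations by W_f, mirroring the procedure outlined above.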