Physical human-robot interaction (pHRI) relies on human actions and can be studied through human upper-limb motions during interactions. Force myography (FMG) signals, which capture muscle contractions, can be used to develop machine learning algorithms for control. In this paper, a novel long-term calibrated FMG-based model is presented to estimate the force applied in dynamic motion during real-time interactions between a human and a linear robot. The proposed FMG-based pHRI framework was investigated in new, unseen, real-time scenarios for the first time. Initially, a long-term reference dataset (multiple source distributions) of upper-limb FMG data was generated as five participants interacted with the robot, applying force in five different dynamic motions. Ten other participants interacted with the robot in two intended motions to provide out-of-distribution (OOD) target data (new, unlearned), which differed from the population data. Two practical scenarios were considered for evaluation: i) a participant applied force in a new, unlearned motion (scenario 1), and ii) a new, unlearned participant applied force in an intended motion (scenario 2). In each scenario, a few long-term FMG-based models were trained using a baseline dataset [the reference dataset (scenarios 1 and 2) and/or a learnt participant dataset (scenario 1)] and a calibration dataset collected during evaluation. Real-time evaluation showed that the proposed long-term calibrated FMG-based models (LCFMG) achieved estimation accuracies of 80%-94% in all scenarios. These results support integrating and generalizing human activity data within a robot control scheme while avoiding an extensive HRI training phase in regular applications.
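
As an illustration of the calibration scheme summarized above, the sketch below shows how a long-term baseline FMG dataset might be pooled with a short session-specific calibration set before training a force estimator. This is a minimal sketch, not the authors' implementation; the file names, channel count, and regressor choice are assumptions made for illustration only.

# Minimal sketch (assumed names and shapes): pool a long-term baseline FMG
# dataset with a short calibration set collected at evaluation time, train a
# force estimator, then estimate applied force from streaming FMG samples.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

N_CHANNELS = 16  # hypothetical number of FMG band channels

# Baseline data: long-term reference recordings (multiple participants/motions).
X_baseline = np.load("reference_fmg.npy")    # assumed shape (n_samples, N_CHANNELS)
y_baseline = np.load("reference_force.npy")  # assumed applied-force labels (N)

# Calibration data: a short recording collected during evaluation from the
# new motion (scenario 1) or the new participant (scenario 2).
X_calib = np.load("calibration_fmg.npy")
y_calib = np.load("calibration_force.npy")

# Train one long-term calibrated model on the pooled data.
X_train = np.vstack([X_baseline, X_calib])
y_train = np.concatenate([y_baseline, y_calib])
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=500))
model.fit(X_train, y_train)

def estimate_force(fmg_sample):
    """Estimate applied force for one real-time FMG sample (length N_CHANNELS)."""
    return float(model.predict(fmg_sample.reshape(1, -1))[0])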