Human–robot physical interaction in shared object manipulation can leverage the complementary strengths of humans and collaborative robots to achieve better task outcomes. In typical physical-interaction schemes, however, the inertial parameters of the manipulated object are either assumed to be known or ignored altogether, reducing the interaction to one directly between the human and the robot. This simplification can cause the robot to misinterpret human intent, because deviations introduced by the object's dynamics are not accounted for. This article presents a method that eliminates the need for a force sensor on the human side during human–robot collaboration: an extended Kalman filter estimates the object's parameters online, and a disturbance observer recovers the force applied by the human. Compared with conventional approaches, the method reduces human effort and improves tracking performance, and it avoids the need to mount and remove sensors in complex scenarios. By estimating the object's mass and the human force in real time, the robot can adapt its behavior for safer and smoother interaction, and the system can handle a variety of objects without object-specific programming.
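As a rough illustration of the idea, and not the article's actual formulation, the sketch below simulates a one-dimensional shared-manipulation task: an extended Kalman filter estimates the object's mass from the robot's commanded force and a velocity measurement, and a momentum-based disturbance observer attributes the remaining unmodeled force to the human. The 1-D simplification, the choice of a momentum observer, and all numerical values and gains are assumptions made for illustration only.

```python
import numpy as np

# --- Simulated 1-D shared-object manipulation (illustrative values only) ---
dt = 0.001               # control period [s]
T = 5.0                  # duration [s]
m_true = 3.0             # true object mass [kg], unknown to the robot
steps = int(T / dt)

rng = np.random.default_rng(0)
v_true = 0.0

# --- EKF over state x = [velocity v, mass m] ---
x = np.array([0.0, 1.0])             # initial guess: 1 kg
P = np.diag([1e-2, 1.0])             # initial covariance
Q = np.diag([1e-4, 1e-6])            # process noise (mass drifts slowly)
R = np.array([[1e-4]])               # velocity measurement noise
H = np.array([[1.0, 0.0]])           # only velocity is measured

# --- Momentum-based disturbance observer for the human force ---
K_obs = 50.0                         # observer gain [1/s]
p_hat = 0.0                          # estimated momentum
f_human_hat = 0.0

log = []
for k in range(steps):
    t = k * dt
    f_robot = 2.0 * np.sin(2 * np.pi * 0.5 * t)      # commanded robot force
    f_human = 1.5 if 2.0 < t < 3.5 else 0.0          # human pushes mid-task

    # True (simulated) object dynamics: m * dv/dt = f_robot + f_human
    v_true += dt * (f_robot + f_human) / m_true
    v_meas = v_true + rng.normal(0.0, 1e-2)          # noisy velocity measurement

    # EKF predict: v' = v + dt * f_robot / m, m' = m
    # (the human force is not modeled here; it acts as process noise)
    v, m = x
    x_pred = np.array([v + dt * f_robot / m, m])
    F = np.array([[1.0, -dt * f_robot / m**2],
                  [0.0, 1.0]])
    P = F @ P @ F.T + Q

    # EKF update with the velocity measurement
    y = v_meas - x_pred[0]
    S = H @ P @ H.T + R
    Kk = P @ H.T @ np.linalg.inv(S)
    x = x_pred + (Kk * y).ravel()
    P = (np.eye(2) - Kk @ H) @ P
    m_hat = x[1]

    # Disturbance observer: integrate the known input plus the current
    # disturbance estimate; the residual against the measured momentum
    # m_hat * v_meas drives the estimate of the human force.
    p_hat += dt * (f_robot + f_human_hat)
    f_human_hat = K_obs * (m_hat * v_meas - p_hat)

    log.append((t, m_hat, f_human, f_human_hat))

print(f"final mass estimate: {log[-1][1]:.2f} kg (true {m_true} kg)")
```

In this toy setup, the mass estimate converges while the robot alone excites the object, and the observer then tracks the human's push without any sensor on the human side; the estimated mass and human force could feed an admittance or impedance controller to shape the robot's response.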