In this paper, a Kinect-based distributed and real-time motion capture system is developed. A trigonometric method is applied to calculate the relative positions of the Kinect v2 sensors from a calibration wand and to register the sensors' positions automatically. By combining the results from multiple sensors with a nonlinear least squares method, the accuracy of the motion capture is optimized. Moreover, to exclude inaccurate results from individual sensors, a computational geometry method is applied in the occlusion approach to detect occluded joint data. The synchronization approach is based on the NTP protocol, which dynamically synchronizes the clocks of the server and the clients and thereby enables real-time operation of the proposed system. Experiments validating the proposed system are conducted with respect to calibration, occlusion, and accuracy. More specifically, the mean absolute error of the calibration results is 0.73 cm, the proposed occlusion method is tested on the upper and lower limbs, and the synchronization component guarantees clock synchronization and real-time performance for more than 99% of the measurement process. Furthermore, to demonstrate the practical performance of the system, a comparison with previously developed motion capture systems (the linear trilateration approach [52] and the geometric trilateration approach [51]) is performed against the benchmark OptiTrack system for the tracked joints of the head, shoulder, elbow, and wrist; the results show that the accuracy of the proposed system is 38.3% and 24.1% higher than that of the two trilateration systems, respectively. A quantitative comparison with the commercial Delsys smart sensor inertial motion capture system is also conducted on lower-limb measurements (i.e., hips, knees, and ankles), for which the standard deviation of the proposed system's measurements is 4.92 cm.
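
The abstract does not spell out the cost function used for the multi-sensor fusion; the following Python sketch only illustrates one plausible formulation, in which a fused 3D joint position is estimated by minimizing weighted residuals to the calibrated measurements of the individual Kinect sensors. The function name, the per-sensor confidence weights, and the example values are assumptions for illustration, not the paper's implementation.

```python
# Illustrative sketch only: fuses one joint's 3D position from several
# calibrated Kinect v2 measurements with a weighted nonlinear least
# squares fit. Weights and example coordinates are hypothetical.
import numpy as np
from scipy.optimize import least_squares

def fuse_joint(measurements, weights):
    """measurements: (N, 3) joint positions already transformed into the
    common (server) frame; weights: (N,) per-sensor confidences."""
    measurements = np.asarray(measurements, dtype=float)
    weights = np.asarray(weights, dtype=float)

    def residuals(p):
        # Weighted residuals of the candidate position p to every sensor.
        return (np.sqrt(weights)[:, None] * (measurements - p)).ravel()

    p0 = measurements.mean(axis=0)          # initial guess: plain average
    result = least_squares(residuals, p0)   # nonlinear least-squares solve
    return result.x

if __name__ == "__main__":
    # Three sensors observing the same wrist joint, one of them noisier
    # and therefore down-weighted.
    obs = [[0.52, 1.10, 2.31], [0.50, 1.12, 2.29], [0.60, 1.05, 2.40]]
    print(fuse_joint(obs, weights=[1.0, 1.0, 0.2]))
```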
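
The NTP-based synchronization relies on the standard four-timestamp exchange between client and server. As a minimal sketch (the paper's actual client/server code is not given here), the clock offset and round-trip delay can be computed as follows, where t0 and t3 are the client's send and receive times and t1 and t2 are the server's receive and send times.

```python
# Minimal sketch of the standard NTP offset/delay computation assumed by
# the synchronization component; all timestamps are in seconds.
def ntp_offset_and_delay(t0, t1, t2, t3):
    """t0: client transmit, t1: server receive,
    t2: server transmit, t3: client receive."""
    offset = ((t1 - t0) + (t2 - t3)) / 2.0   # estimated clock offset
    delay = (t3 - t0) - (t2 - t1)            # round-trip network delay
    return offset, delay

# Example: the client clock lags the server clock by roughly 5 ms,
# with about 20 ms of round-trip network delay.
print(ntp_offset_and_delay(100.000, 100.015, 100.016, 100.021))
```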