Objective: Analyzing human motion is essential for diagnosing movement disorders and guiding rehabilitation for conditions like osteoarthritis, stroke, and Parkinson's disease. Optical motion capture systems are the standard for estimating kinematics, but the equipment is expensive and requires a predefined space. While wearable sensor systems can estimate kinematics in any environment, existing systems are generally less accurate than optical motion capture. Many wearable sensor systems also require a computer in close proximity and use proprietary software, limiting experimental reproducibility. Methods: Here, we present OpenSenseRT, an open-source and wearable system that estimates upper and lower extremity kinematics in real time using inertial measurement units and a portable microcontroller. Results: We compared the OpenSenseRT system to optical motion capture and found an average RMSE of 4.4 degrees across 5 lower-limb joint angles during three minutes of walking and an average RMSE of 5.6 degrees across 8 upper extremity joint angles during a Fugl-Meyer task. The open-source software and hardware are scalable, tracking 1 to 14 body segments with one sensor per segment. A musculoskeletal model and inverse kinematics solver estimate kinematics in real time. The computation frequency depends on the number of tracked segments but is sufficient for real-time measurement for many tasks of interest; for example, the system can track 7 segments at 30 Hz. The system uses off-the-shelf parts costing approximately $100 USD plus $20 for each tracked segment. Significance: The OpenSenseRT system is validated against optical motion capture, low cost, and simple to replicate, enabling movement analysis in clinics, homes, and free-living settings.
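As a rough illustration of the model-plus-inverse-kinematics approach described above, the minimal sketch below runs OpenSim's OpenSense IMU inverse kinematics tool offline using the OpenSim 4.x Python bindings. It is not the OpenSenseRT implementation itself: the file names, results directory, and recorded-orientations workflow are placeholder assumptions, and the real-time system instead streams orientations from the IMUs to the solver on the microcontroller.

```python
import opensim as osim

# Offline IMU-driven inverse kinematics with OpenSim's OpenSense tools
# (illustrative only; paths below are placeholders, not OpenSenseRT files).
ik = osim.IMUInverseKinematicsTool()
ik.set_model_file("calibrated_model.osim")        # placeholder: scaled, IMU-calibrated musculoskeletal model
ik.set_orientations_file("imu_orientations.sto")  # placeholder: recorded IMU orientation data
ik.set_results_directory("ik_results")            # joint-angle results are written here
ik.run(False)                                     # False = run without opening the visualizer
```

In a real-time setting such as the one described in this abstract, the same musculoskeletal model and inverse kinematics solver are driven by orientations streamed from the worn IMUs rather than read from a prerecorded file.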