Although several studies have used wearable sensors to analyze human lifting, these analyses have generally been limited in scope. In this proof-of-concept study, we investigate multiple aspects of offline lift characterization using wearable inertial measurement sensors: detecting the start and end of a lift and classifying the vertical movement of the object, the posture used, the weight of the object, and the asymmetry involved. In addition, the lift duration, the horizontal distance from the lifter to the object, the vertical displacement of the object, and the asymmetry angle are computed as lift parameters. Twenty-four healthy participants each performed two repetitions of 30 different main lifts while wearing a commercial inertial measurement system. The data from these trials were used to develop, train, and evaluate the lift characterization algorithms presented here. The lift detection algorithm had a start time error of 0.10 s ± 0.21 s and an end time error of 0.36 s ± 0.27 s across all 1489 lift trials, with no missed lifts. For posture, asymmetry, vertical movement, and weight, our classifiers achieved accuracies of 96.8%, 98.3%, 97.3%, and 64.2%, respectively, for automatically detected lifts. The vertical height and displacement estimates were, on average, within 25 cm of the reference values. The horizontal distances measured for some lifts differed from the expected values by up to 14.5 cm but were highly consistent. Estimated asymmetry angles were similarly precise. In future work, these proof-of-concept offline algorithms can be extended and improved to run in real time, enabling applications such as real-time health monitoring and feedback for assistive devices.