Labanotation is a widely used notation system for recording body movements, especially dance. It has broad applications in choreography preservation, dance archiving, and related fields. However, creating Labanotation scores manually is difficult and time-consuming, so automatic generation of Labanotation scores is of great interest. In this paper, we aim to generate Labanotation scores from motion capture data recorded in real-world dance performances. First, to handle challenges such as varied dance movement patterns, differences in dancer body shapes, and noise in the motion capture data, we propose a novel feature that is invariant to anthropometric variation and body orientation. Then, we generate the notations for both lower-limb movements and upper-limb gestures. For the lower limbs, we use a hidden Markov model (HMM) to analyze the temporal dynamics of limb movements and map each lower-limb movement to its corresponding dance notation. For the upper limbs, we train a multi-class classifier based on extremely randomized trees (Extra-Trees) to identify the notations for arm gestures. Finally, we generate the Labanotation symbols from the above movement analysis and assemble them into Labanotation scores. The proposed methods generate the spatial symbols describing directions and levels in both the support column and the arm columns from motion capture data, and the resulting scores are clear and reliable. Experimental results show an average recognition accuracy of over 92% for the generated notations, significantly better than previous methods.

INDEX TERMS Labanotation, motion capture data, hidden Markov models, extremely randomized trees, dance movement analysis.
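As a rough illustration of the upper-limb stage, the sketch below trains a multi-class Extra-Trees classifier on synthetic stand-in data; the feature dimensionality, class count, and data are purely hypothetical and not taken from the paper, which uses orientation-invariant features extracted from motion capture data.

```python
# Hypothetical sketch of a multi-class Extra-Trees classifier for
# arm-gesture notations. The data here are synthetic stand-ins:
# 600 "frames" with 12-dimensional feature vectors and 8 notation
# classes (all illustrative choices, not the paper's setup).
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

X = rng.normal(size=(600, 12))          # per-frame limb features
y = rng.integers(0, 8, size=600)        # arm-column notation labels
X += y[:, None] * 0.5                   # shift means so classes are separable

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Extra-Trees: an ensemble of fully randomized decision trees,
# suited to multi-class classification of gesture features.
clf = ExtraTreesClassifier(n_estimators=100, random_state=0)
clf.fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
print(f"test accuracy: {acc:.2f}")
```

On real data, each test-frame prediction would be mapped back to the corresponding direction/level symbol in the arm columns of the score.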