Musical choreography is usually created by professional choreographers, a highly specialized and time-consuming process. To enable intelligent choreography for musicals, this paper uses a mixture density network (MDN) to generate dances matched to target music in three steps: motion generation, motion screening, and feature matching. During motion generation, the mean of the Gaussian distribution output by the MDN is taken as the bone position; during motion screening, motion coherence is measured by the change rate of joint velocity between adjacent frames. Compared with existing studies, the dances generated here are more coherent and realistic. This paper further proposes a multilevel music-motion feature matching algorithm that combines global and local feature matching, improving the unity and coherence between music and movement as well as the consistency and novelty of the movements, their compatibility with the music, and the controllability of dance characteristics. The generated choreography matches the music closely, technically changing the way such artistic creation is done and opening possibilities for music-driven automatic choreography built on motion capture and artificial intelligence.
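
The two quantities named above, taking the mean of the MDN's Gaussian output as the bone position and scoring coherence by the change rate of joint velocity between adjacent frames, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the weighted combination of mixture components, and the frame rate are assumptions.

```python
import numpy as np

def mdn_mean_pose(pi, mu):
    """Collapse an MDN Gaussian mixture into one pose vector.

    pi: (K,) mixture weights; mu: (K, D) component means, where D is
    the flattened bone-position dimension. The paper uses the mean of
    the Gaussian output as the bone position; combining multiple
    components by their weights is an assumption made here.
    """
    return np.average(mu, axis=0, weights=pi)

def coherence_score(poses, fps=30.0):
    """Score motion coherence via the change rate of joint velocity
    between adjacent frames (lower = smoother motion).

    poses: (T, J, 3) joint positions over T frames; fps is assumed.
    """
    vel = np.diff(poses, axis=0) * fps    # per-frame joint velocity
    accel = np.diff(vel, axis=0) * fps    # velocity change rate
    return float(np.linalg.norm(accel, axis=-1).mean())
```

A candidate motion clip with a lower `coherence_score` would be preferred during screening, since large frame-to-frame velocity changes show up as jerky, unrealistic movement.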