Dance, as a unique form of expression, is usually accompanied by music and presented to the audience visually; it enriches people's cultural and spiritual lives while also stimulating their creativity. Dance choreography is traditionally created by a small number of skilled choreographers, working individually or in collaboration, and it demands a high level of expertise and involves considerable complexity. With the advent of motion capture technology and artificial intelligence, computers can now choreograph autonomously from music, and science and technology are changing the way artists produce art today. Computer-based music choreography must solve two fundamental issues: how to create realistic and creative dance movements without relying on motion capture or manual creation, and how to improve the synchronization of music and dance by using appropriate music and motion features together with suitable matching algorithms. To address these two concerns, this article employs a mixture density network (MDN) to generate dances that fit the target music in three steps: action generation, action screening, and feature matching.
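As a rough illustration of how such a three-step pipeline can be organized, the sketch below shows an MDN head that predicts a Gaussian mixture over the next dance pose, then samples candidate poses (action generation), keeps the most plausible ones by model likelihood (action screening), and finally selects the candidate that best aligns with the current music features (feature matching). This is a minimal sketch only; the layer sizes, feature dimensions, the likelihood-based screening rule, and the cosine-similarity matching rule are illustrative assumptions, not details taken from this article.

```python
# Minimal sketch of an MDN-driven generate/screen/match step (assumptions noted above).
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.distributions as D


class MDNHead(nn.Module):
    """Maps a context vector to a K-component Gaussian mixture over pose vectors."""

    def __init__(self, in_dim: int, pose_dim: int, n_components: int = 5):
        super().__init__()
        self.pose_dim = pose_dim
        self.n_components = n_components
        self.pi = nn.Linear(in_dim, n_components)                    # mixture logits
        self.mu = nn.Linear(in_dim, n_components * pose_dim)         # component means
        self.log_sigma = nn.Linear(in_dim, n_components * pose_dim)  # log std devs

    def distribution(self, h: torch.Tensor) -> D.MixtureSameFamily:
        K, P = self.n_components, self.pose_dim
        mix = D.Categorical(logits=self.pi(h))
        mu = self.mu(h).view(-1, K, P)
        sigma = self.log_sigma(h).view(-1, K, P).exp().clamp(min=1e-4)
        comp = D.Independent(D.Normal(mu, sigma), 1)
        return D.MixtureSameFamily(mix, comp)


def choreograph_step(mdn: MDNHead, context: torch.Tensor,
                     music_feat: torch.Tensor, n_candidates: int = 32) -> torch.Tensor:
    """One generation step: sample candidates, screen them, match against the music."""
    gmm = mdn.distribution(context)                      # context: (1, in_dim)

    # 1) Action generation: draw candidate poses from the predicted mixture.
    candidates = gmm.sample((n_candidates,)).squeeze(1)  # (N, pose_dim)

    # 2) Action screening: keep the most plausible half by model log-likelihood.
    log_prob = gmm.log_prob(candidates.unsqueeze(1)).squeeze(1)  # (N,)
    keep = log_prob.topk(n_candidates // 2).indices
    candidates = candidates[keep]

    # 3) Feature matching (assumed rule): choose the candidate most similar to the
    #    music feature; a naive truncation stands in for a learned projection here.
    proj = music_feat[: candidates.shape[1]]
    scores = F.cosine_similarity(candidates, proj.unsqueeze(0), dim=-1)
    return candidates[scores.argmax()]


if __name__ == "__main__":
    mdn = MDNHead(in_dim=64, pose_dim=48)
    context = torch.randn(1, 64)    # e.g. output of a recurrent music/motion encoder
    music_feat = torch.randn(128)   # e.g. beat/MFCC features for the current frame
    pose = choreograph_step(mdn, context, music_feat)
    print(pose.shape)               # torch.Size([48])
```

In practice the context vector would come from an encoder over previous poses and music frames, and the matching step would compare learned embeddings rather than raw vectors; the sketch only fixes the order of the three steps.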