Numerous studies have shown a close relationship between movement and music [3], [7], [8], [11], [14], [16], [17]. That is why Leman calls for new mediation technologies to query music in a corporeal way [9]. The goal of the present study was therefore to explore how movement captured by smartphone accelerometer data can be related to musical properties. Participants (N = 23, mean age = 34.6 yrs, SD = 13.7 yrs, 13 females, 10 males) moved a smartphone to 15 musical stimuli of 20 s length, presented in random order. Motion features related to tempo, smoothness, size, regularity, and direction were extracted from the accelerometer data to predict the musical qualities "rhythmicity", "pitch level + range", and "complexity", as assessed by three music experts. Motion features selected by a 20-fold lasso predicted the musical properties to the following degrees: "rhythmicity" (R² = .47), "pitch level + range" (R² = .03), and "complexity" (R² = .10). We conclude that musical properties can be predicted from the movement they evoke, and that an embodied approach to Music Information Retrieval is feasible.

CCS CONCEPTS
• Human-centered computing → Empirical studies in interaction design; Gestural input; • Information systems → Music retrieval; • Computing methodologies → Cognitive science;
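
To illustrate the modeling step described in the abstract, the following is a minimal sketch (not the authors' code) of predicting a single expert-rated quality such as "rhythmicity" from motion features with a cross-validated lasso. The data shapes, synthetic feature values, and the use of scikit-learn's LassoCV are assumptions made for the example only.

    # Illustrative sketch: cross-validated lasso predicting an expert-rated
    # musical property from smartphone motion features (synthetic data).
    import numpy as np
    from sklearn.linear_model import LassoCV
    from sklearn.metrics import r2_score

    rng = np.random.default_rng(0)

    # Hypothetical design matrix: one row per (participant, stimulus) trial,
    # columns = motion features (tempo, smoothness, size, regularity, direction).
    X = rng.normal(size=(23 * 15, 5))

    # Hypothetical target: expert-rated "rhythmicity" per trial.
    y = X @ np.array([0.8, 0.3, 0.0, 0.5, 0.1]) + rng.normal(scale=0.5, size=X.shape[0])

    # Lasso with 20-fold cross-validation to choose the regularization strength;
    # coefficients shrunk to zero correspond to features dropped from the model.
    model = LassoCV(cv=20).fit(X, y)
    print("selected coefficients:", model.coef_)
    print("R^2:", r2_score(y, model.predict(X)))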