Physical activity patterns can reveal information about a person's health status. In comparison to a patient's self-report, a smartphone's built-in sensors can collect activity data more objectively, unobtrusively, and continuously, and a variety of approaches for analyzing such data have been proposed in the literature. In this study, we applied the movelet method to classify activities using smartphone accelerometer and gyroscope data, which measure the phone's acceleration and angular velocity, respectively. The movelet method constructs a personalized dictionary of labeled activity segments for each participant from training data and then classifies activities in new data by matching against that dictionary. Our results show that this method has the advantages of being interpretable and transparent. A distinctive aspect of our application is the integration of information from multiple sensors: whereas prior applications used a single sensor, our approach jointly incorporates the accelerometer and gyroscope within the movelet framework. Our findings show that combining data from the two sensors can result in more accurate activity recognition than using either sensor alone. In particular, the joint-sensor method reduces the errors that the gyroscope-only method makes in differentiating between standing and sitting, and the errors that the accelerometer-only method makes in classifying vigorous activities.
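The dictionary-and-matching idea described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names (`build_dictionary`, `classify_window`), the window length, and the use of Euclidean distance are all illustrative assumptions.

```python
import numpy as np

def build_dictionary(signal, labels, win=10):
    """Sketch of movelet dictionary construction: every overlapping
    window of length `win` in the labeled training signal becomes a
    movelet, tagged with the activity label at its starting sample.
    (Window length is an illustrative choice, not the paper's setting.)"""
    movelets, movelet_labels = [], []
    for t in range(len(signal) - win + 1):
        movelets.append(signal[t:t + win])
        movelet_labels.append(labels[t])
    return np.array(movelets), np.array(movelet_labels)

def classify_window(window, movelets, movelet_labels):
    """Assign a new window the activity label of its nearest movelet
    in the dictionary, here under Euclidean (L2) distance."""
    dists = np.linalg.norm(movelets - window, axis=(1, 2))
    return movelet_labels[np.argmin(dists)]
```

A joint-sensor variant in this spirit would compute one distance per sensor (accelerometer and gyroscope) for each movelet and combine them, for example by summing, before taking the nearest match.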