Complete tetraplegia can deprive a person of hand function. Assistive technologies may improve autonomy, but the need for ergonomic interfaces that let users pilot these devices persists. Despite the paralysis of their arms, people with tetraplegia may retain residual shoulder movements. In this work we explored these movements as a means to control assistive devices. Methods: We captured shoulder movement with a single inertial sensor and, by training a support vector machine (SVM) classifier, decoded this information into user intent. Results: The setup and training process takes only a few minutes, so the classifiers can be user specific. We tested the algorithm with 10 able-bodied and 2 spinal cord injury participants. The average classification accuracy was 80% and 84%, respectively. Conclusion: The proposed algorithm is easy to set up, its operation is fully automated, and the achieved results are on par with state-of-the-art systems. Significance: Assistive devices for persons without hand function are limited by their user interfaces. Our work presents a novel method to overcome some of these limitations by classifying user movement and decoding it into user intent, with a simple setup and training procedure and no need for manual tuning. We demonstrate its feasibility in experiments with end users, including persons with complete tetraplegia without hand function.
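For context, the sketch below illustrates how a user-specific SVM classifier of the kind described in Methods could be trained on windowed features from a single inertial sensor. It uses scikit-learn; the feature set, window shape, and gesture labels are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumptions, not the paper's implementation): train a
# user-specific SVM on features extracted from a single shoulder-mounted IMU.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score


def window_features(imu_windows: np.ndarray) -> np.ndarray:
    """Reduce each window of raw IMU samples to a fixed-length feature vector.

    imu_windows: shape (n_windows, n_samples, n_channels), e.g. 6 channels
    for a 3-axis accelerometer plus 3-axis gyroscope.
    """
    means = imu_windows.mean(axis=1)
    stds = imu_windows.std(axis=1)
    ranges = imu_windows.max(axis=1) - imu_windows.min(axis=1)
    return np.concatenate([means, stds, ranges], axis=1)


# Hypothetical calibration recording: a few minutes of labelled shoulder
# movements (e.g. 0 = rest, 1 = shrug, 2 = roll forward, 3 = roll backward).
rng = np.random.default_rng(0)
windows = rng.normal(size=(200, 50, 6))   # placeholder IMU windows
labels = rng.integers(0, 4, size=200)     # placeholder gesture labels

X = window_features(windows)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))

# Quick per-user evaluation, then fit on all calibration data for online use.
print("CV accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
clf.fit(X, labels)
```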