Abstract-We propose a real-time motion synthesis framework for controlling the animation of a 3D avatar. Instead of relying on motion capture devices for the control signal, we use low-cost and ubiquitously available 3D accelerometer sensors. The framework is data-driven and consists of two steps: model learning from an existing high-quality motion database, and motion synthesis from the control signal. In the model learning step, we apply a nonlinear manifold learning method to build a motion model from a large motion capture database. In the synthesis step, taking the 3D accelerometer signals as input, we synthesize high-quality motion from the learned model. The system runs in real time, making it suitable for a wide range of interactive applications, such as character control in 3D virtual environments and occupational training.
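A minimal conceptual sketch of the two-step pipeline described above, not the authors' implementation: the abstract does not name the specific manifold learning method or the mapping from accelerometer signals to the motion model, so Isomap and k-nearest-neighbour regression are used here only as assumed placeholders, and all class and variable names are hypothetical.

```python
# Sketch of the data-driven pipeline: (1) learn a low-dimensional motion
# model from a motion capture database via nonlinear manifold learning,
# (2) map incoming 3D accelerometer frames onto that model to synthesize
# full-body poses. Isomap and k-NN regression are placeholder choices.
import numpy as np
from sklearn.manifold import Isomap
from sklearn.neighbors import KNeighborsRegressor


class AccelToMotionSynthesizer:
    def __init__(self, n_manifold_dims=8, n_neighbors=5):
        self.manifold = Isomap(n_components=n_manifold_dims)
        self.accel_to_manifold = KNeighborsRegressor(n_neighbors=n_neighbors)
        self.manifold_to_pose = KNeighborsRegressor(n_neighbors=n_neighbors)

    def fit(self, mocap_poses, accel_signals):
        """Model learning: mocap_poses is (n_frames, pose_dim); accel_signals
        is (n_frames, 3 * n_sensors), recorded in sync with the mocap data."""
        # Embed the high-dimensional poses in a low-dimensional motion manifold.
        manifold_coords = self.manifold.fit_transform(mocap_poses)
        # Learn mappings: accelerometer frame -> manifold point -> full pose.
        self.accel_to_manifold.fit(accel_signals, manifold_coords)
        self.manifold_to_pose.fit(manifold_coords, mocap_poses)
        return self

    def synthesize(self, accel_frame):
        """Motion synthesis: one accelerometer frame in, one full pose out."""
        coords = self.accel_to_manifold.predict(accel_frame.reshape(1, -1))
        return self.manifold_to_pose.predict(coords)[0]


# Toy usage with random stand-in data; real use would load a motion capture
# database and synchronized accelerometer recordings.
rng = np.random.default_rng(0)
poses = rng.normal(size=(500, 60))   # e.g. 20 joints x 3 angles per frame
accels = rng.normal(size=(500, 9))   # e.g. 3 sensors x (x, y, z)
model = AccelToMotionSynthesizer().fit(poses, accels)
pose = model.synthesize(accels[0])
print(pose.shape)                    # (60,)
```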