This paper presents an implementation of the Active Appearance Model that tracks a face in real time on a mobile device. We achieve this performance by discarding the explicit texture model, using fixed-point arithmetic for much of the computation, applying a sequence of models of increasing complexity, and exploiting a sparse basis projection via Haar-like features. Our results show that the Haar-like feature basis achieves performance comparable to more traditional approaches while being better suited to mobile devices. Finally, we discuss mobile applications of the system, such as face verification, teleconferencing, and human-computer interaction.