Quadriplegics face a communication obstacle: their physical abilities are restricted, leaving them unable to speak or use their limbs, with only the upper neck remaining mobile. To break this barrier, we propose a recognition system and a new communication language that combines Morse code with head movements. We aim to overcome the limitations of camera-based and wearable-sensor methods, including occlusion, privacy concerns, and user inconvenience. The goal is to passively detect a quadriplegic's head movements and map them to their corresponding characters. The dataset, covering all 26 letters of the alphabet, was gathered in both single-user and multi-human environments, with multiple locations for each setting. For evaluation, 2% of the samples from the unseen environment are randomly selected and combined with the seen-environment data to form the training set. Based on the results, our system demonstrates practical feasibility for real-world deployment, achieving accuracy of 94% and 80% in single-user and multi-human environments, respectively.
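
To make the character-mapping step concrete, below is a minimal sketch in Python of how a recognized head-movement sequence could be decoded into a letter via Morse code. The specific movement-to-symbol encoding (a short nod as a dot, a long nod as a dash) and the names `MOVEMENT_TO_SYMBOL` and `decode_letter` are illustrative assumptions, not the system's actual design.

```python
# Standard international Morse code for the 26 alphabet letters.
MORSE_TO_CHAR = {
    ".-": "A", "-...": "B", "-.-.": "C", "-..": "D", ".": "E",
    "..-.": "F", "--.": "G", "....": "H", "..": "I", ".---": "J",
    "-.-": "K", ".-..": "L", "--": "M", "-.": "N", "---": "O",
    ".--.": "P", "--.-": "Q", ".-.": "R", "...": "S", "-": "T",
    "..-": "U", "...-": "V", ".--": "W", "-..-": "X", "-.--": "Y",
    "--..": "Z",
}

# Hypothetical mapping from recognized head movements to Morse symbols;
# the actual encoding used by the system may differ.
MOVEMENT_TO_SYMBOL = {"short_nod": ".", "long_nod": "-"}


def decode_letter(movements: list[str]) -> str:
    """Map a sequence of recognized head movements to one alphabet character."""
    code = "".join(MOVEMENT_TO_SYMBOL[m] for m in movements)
    return MORSE_TO_CHAR.get(code, "?")  # "?" if the sequence is not a valid letter


if __name__ == "__main__":
    print(decode_letter(["short_nod", "long_nod"]))                 # ".-"  -> A
    print(decode_letter(["long_nod", "long_nod", "long_nod"]))      # "---" -> O
```

In this sketch, the recognition model only needs to distinguish a small vocabulary of head movements; the Morse lookup then expands that vocabulary into the full 26-letter alphabet.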