More than 13% of the Spanish population suffers from hearing impairment. Although several hearing aids and implants can make sounds clearer and louder, some people are unable to use them and, consequently, their only means of communication is Sign Language. However, this language is not widely known in society, so those who are deaf or hard of hearing may be socially excluded and experience frustration due to the lack of communication. In this context, Sign Language recognition and interpretation would help break down existing communication barriers and foster the creation of inclusive environments. With that aim, this paper presents a real-time platform to recognize and interpret finger-spelt words in Spanish Sign Language (Lengua de Signos Española). As finger spelling implies recognizing each signed letter, a comparative analysis of different deep learning techniques for recognizing the Spanish Sign Language alphabet has been carried out. Given the lack of Spanish Sign Language datasets, the first step was to capture and build an image dataset representing its 30 letters. As the alphabet contains both static and in-motion letters, spatial and temporal analyses have been conducted by considering different kinds of neural networks (Convolutional Neural Networks, Recurrent Neural Networks, and Vision Transformers). The experimental results highlight the good performance of the studied architectures, reaching a maximum accuracy of 79.96% on previously unseen data. Finally, a real-time platform for the recognition and interpretation of finger-spelt words in Spanish Sign Language has been implemented, enabling such communication.