This paper presents an intelligent Android system developed for the automatic recognition of Arabic and American sign languages, as well as for teaching and learning them. The system consists of two subsystems. The first subsystem, the Sensory Smart Glove System (SSG-Sys), is based on the Internet of Things (IoT) and is designed for automatic sign language recognition. It comprises a smart glove equipped with five flex sensors, which measure the bending of the fingers as gestures are performed, and an MPU-6050 accelerometer that tracks the hand's position along three axes (X, Y, Z). The sensed data are processed by an Arduino Nano microcontroller, and the text of the recognized gesture is transmitted via an HC-05 Bluetooth module to an Android phone, which displays the text and converts it into audible speech through an Android application. SSG-Sys achieved high recognition accuracy for both Arabic Sign Language (ArSL), at 98.42%, and American Sign Language (ASL), at 98.22%.
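To make the SSG-Sys data path concrete, the following minimal Arduino-style sketch illustrates how such a glove could read the five flex sensors and the MPU-6050 and send a recognized gesture label as plain text over the HC-05. The pin assignments, thresholds, and the simple template-matching step are illustrative assumptions, not the system's actual firmware.

#include <Wire.h>
#include <SoftwareSerial.h>

// Assumed wiring: HC-05 on pins 10/11, flex sensors on the analog pins below.
SoftwareSerial bt(10, 11);
const int flexPins[5] = {A0, A1, A2, A3, A6};

// Read one signed 16-bit value (high byte first) from the MPU-6050 burst.
int16_t read16() {
  int hi = Wire.read();
  int lo = Wire.read();
  return (int16_t)((hi << 8) | lo);
}

void setup() {
  Wire.begin();
  Wire.beginTransmission(0x68);   // MPU-6050 I2C address
  Wire.write(0x6B);               // PWR_MGMT_1: clear the sleep bit
  Wire.write(0);
  Wire.endTransmission(true);
  bt.begin(9600);                 // HC-05 default baud rate
}

void loop() {
  // Finger bending: one 0-1023 reading per flex sensor.
  int flex[5];
  for (int i = 0; i < 5; i++) {
    flex[i] = analogRead(flexPins[i]);
  }

  // Hand position: raw X/Y/Z acceleration starting at register ACCEL_XOUT_H.
  Wire.beginTransmission(0x68);
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom(0x68, 6);
  int16_t ax = read16();
  int16_t ay = read16();
  int16_t az = read16();

  // Placeholder matching step: compare the readings against stored gesture
  // templates and send the matched label as text to the Android phone.
  if (flex[0] > 600 && flex[1] > 600 && az > 8000) {
    bt.println("Hello");
  }
  delay(200);
}

On the phone side, the received text can be rendered on screen and passed to the platform's text-to-speech service to produce the audible output described above.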
The second subsystem, the Mobile Augmented Sign Language Learning System (MASLL-Sys), is a mobile educational app that leverages marker-based augmented reality to make the sign language learning process more realistic and effective. It consists of five main modules: registration, learning, augmented learning, tests, and student. Overall, the performance of the proposed intelligent system was evaluated by a group of experts, who found it to be a promising tool for sign language recognition and learning.