This study focuses on recognizing and categorizing South Indian Sign Language gestures across different age groups using transfer learning models. Sign language serves as a natural and expressive communication method for individuals with hearing impairments. The study develops deep transfer learning models, namely Inception-V3, VGG-16, and ResNet-50, to accurately identify and classify double-handed gestures for South Indian languages such as Kannada, Tamil, and Telugu. A dataset of 30,000 images of double-handed gestures, with 10,000 images for each considered age group (1-7, 8-25, and above 25), is used to fine-tune the models for improved classification performance. Among the tested models, Inception-V3 achieves the best performance, with a test precision of 95.20% and a validation accuracy of 92.45%, demonstrating its effectiveness in categorizing double-handed gesture images into ten classes.
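The sketch below illustrates the general transfer learning setup described above, not the authors' actual implementation: an ImageNet-pretrained Inception-V3 backbone is reused with a new ten-class softmax head and trained on a gesture image dataset. The directory paths, image size, batch size, and hyperparameters are illustrative assumptions.

```python
# Minimal sketch (assumptions noted in comments): fine-tuning Inception-V3
# with Keras for a 10-class double-handed gesture classification task.
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import InceptionV3

NUM_CLASSES = 10          # ten gesture classes, as stated in the abstract
IMG_SIZE = (299, 299)     # Inception-V3's native input resolution (assumed)

# Load ImageNet-pretrained weights and drop the original classifier head.
base = InceptionV3(include_top=False, weights="imagenet",
                   input_shape=IMG_SIZE + (3,), pooling="avg")
base.trainable = False    # freeze convolutional features for transfer learning

model = models.Sequential([
    base,
    layers.Dropout(0.3),                                  # assumed regularization
    layers.Dense(NUM_CLASSES, activation="softmax"),      # new classification head
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="categorical_crossentropy",
              metrics=["accuracy"])

# Hypothetical dataset layout: one sub-folder per gesture class.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "gestures/train", image_size=IMG_SIZE, label_mode="categorical", batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "gestures/val", image_size=IMG_SIZE, label_mode="categorical", batch_size=32)

model.fit(train_ds, validation_data=val_ds, epochs=10)
```

The same recipe applies to the VGG-16 and ResNet-50 baselines by swapping the backbone class; a common follow-up step is to unfreeze the top convolutional blocks and continue training at a lower learning rate.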