Several pathologies can alter the way people walk, i.e., their gait. Gait analysis can be used to detect such alterations and, therefore, help diagnose certain pathologies or assess people’s health and recovery. Simple vision-based systems have considerable potential in this area, as they allow gait to be captured in unconstrained environments, such as at home or in a clinic, while the required computations can be performed remotely. State-of-the-art vision-based systems for gait analysis use deep learning strategies, thus requiring a large amount of data for training. However, to the best of our knowledge, the largest publicly available pathological gait dataset contains only 10 subjects, simulating five types of gait. This paper presents a new dataset, GAIT-IT, captured from 21 subjects simulating five types of gait, at two severity levels. The dataset was recorded in a professional studio, making the sequences free of background camouflage, illumination variations and other visual artifacts. The dataset is used to train a novel automatic gait analysis system. Compared to state-of-the-art systems, the proposed one achieves a drastic reduction in the number of trainable parameters, memory requirements and execution times, while maintaining comparable classification accuracy. Recognizing the importance of remote healthcare, the proposed automatic gait analysis system is integrated into a prototype web application. This prototype is presently hosted on a private network; after further testing and development, it will allow people to upload a video of themselves walking and execute a web service that classifies their gait. The web application has a user-friendly interface, usable by healthcare professionals as well as laypersons. It also associates the identified type of gait with potential gait pathologies that exhibit the identified characteristics.