The future of healthcare may look completely different from today's clinic-centered services. Rapidly developing technologies are expected to transform clinics around the world. However, the care delivered to impaired patients, such as elderly and disabled people, may still require hands-on human expertise. The aim of this study is to propose a predictive model that pre-diagnoses illnesses by analyzing symptoms interactively collected from patients via hand gestures over a period of time. This is particularly helpful in assisting clinicians to better understand, and make more accurate decisions about, future care plans for their patients. The hand gestures are detected, the time of each gesture is recorded, and the gestures are then associated with their designated symptoms. This information is captured as provenance graphs constructed according to the W3C PROV data model. The provenance graphs are analyzed by extracting several network metrics, and supervised machine-learning algorithms are then used to build a predictive model. The model predicts diseases from the symptoms with a maximum accuracy of 84.5%.
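
As an illustration, the sketch below shows one way the described pipeline could be wired together, assuming the Python `prov` library for the PROV document, `networkx` for the network metrics, and `scikit-learn` for the supervised classifier; the namespace, gesture and symptom identifiers, timestamps, and the synthetic dataset are all hypothetical placeholders, not the study's actual data or code.

```python
# Minimal sketch of the gesture-to-symptom provenance pipeline.
# All names (namespace, gestures, symptoms) are illustrative assumptions.
# Requires: pip install prov networkx scikit-learn
import datetime

import networkx as nx
from prov.model import ProvDocument
from prov.graph import prov_to_graph
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score


def build_prov_document():
    """Record one gesture-to-symptom observation as a W3C PROV document."""
    doc = ProvDocument()
    doc.add_namespace("ex", "http://example.org/")  # hypothetical namespace

    patient = doc.agent("ex:patient-001")
    # The gesture is modelled as an activity with its recorded time span.
    gesture = doc.activity(
        "ex:gesture-thumbs-down",
        datetime.datetime(2024, 1, 1, 9, 0, 0),
        datetime.datetime(2024, 1, 1, 9, 0, 5),
    )
    symptom = doc.entity("ex:symptom-headache")

    doc.wasAssociatedWith(gesture, patient)  # who performed the gesture
    doc.wasGeneratedBy(symptom, gesture)     # which symptom it denotes
    return doc


def graph_features(doc):
    """Extract simple network metrics from the provenance graph."""
    g = prov_to_graph(doc)  # networkx MultiDiGraph of the PROV document
    centrality = nx.degree_centrality(g)
    return [
        g.number_of_nodes(),
        g.number_of_edges(),
        nx.density(g),
        sum(centrality.values()) / len(centrality),  # mean degree centrality
    ]


# Placeholder dataset: one feature vector per patient provenance graph,
# with synthetic disease labels standing in for the study's real data.
X = [graph_features(build_prov_document()) for _ in range(40)]
y = [i % 2 for i in range(40)]  # two dummy disease classes

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In this framing, each patient session yields one provenance graph, each graph is reduced to a fixed-length vector of network metrics, and any standard supervised classifier can then be trained on those vectors; a random forest is used here purely as an example, as the text does not name the specific algorithms.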