The human tongue has long been believed to be a window into a patient's health. The present study introduces a novel approach to inferring patient age, gender, and weight from tongue images using pre-trained deep convolutional neural networks (CNNs). The deep CNN models trained on dorsal tongue images performed strongly on age prediction, with a Pearson correlation coefficient of 0.71 and a mean absolute error of 8.5 years. We also obtained excellent gender classification, with a mean accuracy of 80% and an AUC of 88%. Weight prediction reached a moderate level of accuracy, with a Pearson correlation coefficient of 0.39 and a mean absolute error of 9.06 kilograms. These findings support our hypothesis that the human tongue carries clinically relevant information about a patient. This study demonstrates the feasibility of combining pre-trained deep CNNs with a large tongue image dataset to build computational models that predict patient characteristics, enabling non-invasive, convenient, and inexpensive health monitoring and diagnosis.
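The evaluation metrics reported above for the regression targets (Pearson correlation coefficient and mean absolute error) can be sketched in plain Python. The sample ages below are hypothetical values for illustration only, not data from the study:

```python
import math

def pearson_r(x, y):
    # Pearson correlation: covariance normalized by the product of standard deviations
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def mean_absolute_error(y_true, y_pred):
    # Average absolute deviation between predictions and ground truth
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical ground-truth ages vs. model predictions (illustrative only)
ages_true = [25, 34, 41, 52, 63, 70]
ages_pred = [28, 31, 45, 50, 58, 74]
print(pearson_r(ages_true, ages_pred))
print(mean_absolute_error(ages_true, ages_pred))  # 3.5 years for this toy data
```

A correlation of 0.71 with an MAE of 8.5 years, as reported for age, indicates that predictions track true ages closely enough for coarse health screening, while the weight results (r = 0.39, MAE = 9.06 kg) show a weaker but still informative signal.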