Breast cancer is one of the leading causes of death among women worldwide, and early detection remains the best strategy for reducing mortality. Mammography is currently the standard screening modality, but it has shortcomings. To complement it, thermal infrared-based Computer-Aided Diagnosis (CADx) tools have been proposed as economical, less hazardous, and suitable for a wide range of age groups. Although viable, most existing CADx systems are built primarily from frontal breast thermograms and are therefore likely to miss lesions that develop on the lateral sides of the breast. Additionally, these systems often disregard critical clinical data, such as risk factors. This paper presents a novel CADx system that utilizes deep learning techniques for breast cancer detection. The system incorporates multiple breast thermogram views together with the corresponding patient clinical data to improve diagnostic accuracy. We describe the methodology of the system, including the extraction of regions of interest from the images and the use of transfer learning to train three different models. We evaluate the performance of the models and compare them to similar works in the literature. The results demonstrate that the multi-input approach outperforms single-input models, achieving an overall accuracy of 90.48%, a sensitivity of 93.33%, and an area under the ROC curve (AUROC) of 0.94. This approach could offer a more cost-effective and less hazardous screening option for breast cancer detection across a wide range of age groups.
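To picture the multi-input idea described above, the sketch below shows a minimal Keras model that fuses an image branch (transfer learning on a pretrained backbone) with a small dense branch for tabular clinical data. It assumes a ResNet50 backbone, a single thermogram input (the paper fuses several views), and hypothetical layer sizes and feature counts; it is an illustrative sketch, not the authors' exact architecture.

```python
# Hypothetical sketch of a multi-input CADx model: image branch (transfer
# learning) + clinical-data branch. Backbone choice and layer sizes are
# assumptions for illustration only.
import tensorflow as tf
from tensorflow.keras import layers, models, applications

def build_multi_input_model(image_shape=(224, 224, 3), n_clinical_features=8):
    # Image branch: pretrained ResNet50 backbone, frozen for transfer learning
    backbone = applications.ResNet50(include_top=False, weights="imagenet",
                                     input_shape=image_shape, pooling="avg")
    backbone.trainable = False

    image_input = layers.Input(shape=image_shape, name="thermogram")
    x = applications.resnet50.preprocess_input(image_input)
    x = backbone(x, training=False)
    x = layers.Dense(128, activation="relu")(x)

    # Clinical branch: tabular risk factors (e.g., age, family history)
    clinical_input = layers.Input(shape=(n_clinical_features,), name="clinical")
    c = layers.Dense(32, activation="relu")(clinical_input)
    c = layers.Dense(16, activation="relu")(c)

    # Fuse both branches and classify (benign vs. malignant)
    merged = layers.concatenate([x, c])
    merged = layers.Dropout(0.3)(merged)
    output = layers.Dense(1, activation="sigmoid", name="diagnosis")(merged)

    model = models.Model(inputs=[image_input, clinical_input], outputs=output)
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy", tf.keras.metrics.AUC(name="auroc")])
    return model

model = build_multi_input_model()
model.summary()
```

In this kind of fusion design, the image and clinical branches learn separate representations that are concatenated before the final classifier, which is one common way to let tabular risk factors complement the thermogram features.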