DNA-based molecular assays for determining mutational status in melanomas are time-consuming and costly. As an alternative, we applied a deep convolutional neural network (CNN) to histopathology images of tumors from 257 melanoma patients and developed a fully automated model that first selects tumor-rich areas (area under the curve, AUC=0.98) and then predicts the presence of mutated BRAF or NRAS. Network performance was enhanced on BRAF-mutated melanomas ≤1.0 mm (AUC=0.83) and on non-ulcerated NRAS-mutated melanomas (AUC=0.92). Applying our models to histological images of primary melanomas from The Cancer Genome Atlas database also demonstrated improved performance on thinner BRAF-mutated melanomas and on non-ulcerated NRAS-mutated melanomas. We propose that deep learning-based analysis of histological images has the potential to become integrated into clinical decision making for the rapid detection of mutations of interest in melanoma.

Mutations in the BRAF oncogene are found in 50-60% of all melanomas1, while NRAS mutations account for an additional 15-20%. With the development of targeted therapies2,3, determining the mutational status of BRAF and NRAS has become an integral component of the management of Stage III/IV melanomas. DNA molecular assays such as Sanger sequencing, pyrosequencing, and next-generation sequencing (NGS) are the current gold standard for determining mutational status4. However, these methods are costly and time-consuming. Immunohistochemistry, real-time polymerase chain reaction (PCR), and automated platforms5,6,7 are rapid and less expensive alternatives, but they are limited to screening for specific mutations, such as BRAF-V600E/K or NRAS-Q61R/L, and may fail to identify rare mutational variants in patients who might otherwise have benefited from adjuvant targeted therapy.

Deep convolutional neural network (CNN) methods for predicting mutational status have been demonstrated in other solid tumors. CNNs apply multiple layers of convolution operations, pooling layers, and fully connected layers to classify images into classes of interest by identifying image features that are often not directly detectable by the human eye. Deep CNNs, which rely on non-linear learning algorithms, have been successful in processing large data sets, particularly for image analysis8. Using images from The Cancer Genome Atlas (TCGA), a collaborative cancer genomics database9, our group previously developed a machine learning algorithm that predicts mutations in six different genes, including EGFR and STK11, in lung carcinoma10. In breast cancer, deep learning applied to tumor microarray images has been shown to predict ER status with 84% accuracy11.

In this study, we adapt our previous deep learning algorithm to a different dataset composed of histopathology images of primary melanomas resected from patients pros...
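As a rough illustration of the generic CNN ingredients described above (convolution operations, pooling layers, and fully connected layers applied to image tiles), the sketch below defines a minimal tile classifier. It is not the network used in this study; the architecture, the 299x299 tile size, and the two-class output are assumptions chosen purely for illustration.

```python
# Minimal sketch of a CNN tile classifier (illustrative only, not the study's model).
import torch
import torch.nn as nn

class TileClassifier(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),  # convolution over an RGB tile
            nn.ReLU(),
            nn.MaxPool2d(2),                             # pooling halves spatial resolution
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),                     # collapse each feature map to one value
            nn.Flatten(),
            nn.Linear(64, num_classes),                  # fully connected output layer
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Example: score a small batch of 299x299 histopathology tiles.
model = TileClassifier(num_classes=2)
tiles = torch.randn(4, 3, 299, 299)                      # placeholder tile batch
probs = torch.softmax(model(tiles), dim=1)               # per-tile class probabilities
```

In a whole-slide setting such as the one described here, per-tile predictions would typically be aggregated (for example, averaged) into a single slide-level score; that aggregation strategy is likewise an assumption of this sketch rather than a description of the study's pipeline.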