Knowledge of the 10B microdistribution is of great relevance in BNCT studies. Since the assessment of 10B concentration through neutron autoradiography depends on the correct quantification of tracks in a nuclear track detector, image acquisition and processing conditions must be controlled and verified in order to obtain accurate results applicable in the framework of BNCT. With this aim, an image verification process was proposed, based on parameters extracted from the quantified nuclear tracks. Track characterization was performed by selecting a set of morphological and pixel-intensity uniformity parameters from the quantified objects (area, diameter, roundness, aspect ratio, heterogeneity and clumpiness). Their distributions were studied, revealing distinct behaviours in images generated from different samples and under different acquisition conditions. The distributions corresponding to samples originating from the boron neutron capture (BNC) reaction showed similar attributes for each analyzed parameter, proving robust to the experimental process but sensitive to illumination and focus conditions. Based on these observations, manual feature extraction was performed as a pre-processing step. A Support Vector Machine (SVM) and a dense (fully connected) Neural Network (NN) were optimized, trained, and tested. The final performance metrics were similar for both models: the SVM reached 93% accuracy and 93% precision, while the NN reached 94% accuracy and 95% precision. Based on the distribution of the predicted class probabilities, the NN showed a better capacity to reject inadequate images and was therefore selected to perform the image verification step prior to quantification. The trained NN was able to correctly classify images regardless of their track density. The exhaustive characterization of the nuclear tracks provided new knowledge about the generation of autoradiographic images. The inclusion of machine learning in the analysis workflow optimizes the boron determination process and paves the way for further applications in the field of boron imaging.
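
As a point of reference for the track characterization step, the sketch below shows one plausible way to compute the named morphological descriptors from a binarized autoradiography image with scikit-image; this is not the authors' pipeline, and "heterogeneity" and "clumpiness" are uniformity measures specific to the quantification software used in the study, so simple intensity-based proxies are substituted here and labelled as assumptions.

```python
# Illustrative sketch (assumed tooling, not the study's code): per-track
# morphological and intensity descriptors from a binary track mask.
import numpy as np
from skimage import measure

def track_features(binary_mask: np.ndarray, intensity: np.ndarray) -> list[dict]:
    """Return one descriptor dict per detected track (labelled region)."""
    labels = measure.label(binary_mask)
    feats = []
    for region in measure.regionprops(labels, intensity_image=intensity):
        area = region.area
        perimeter = region.perimeter if region.perimeter > 0 else 1.0
        pixels = region.intensity_image[region.image]  # pixel values inside the track
        feats.append({
            "area": area,
            "diameter": region.equivalent_diameter,
            "roundness": 4.0 * np.pi * area / perimeter ** 2,
            "aspect_ratio": region.major_axis_length / max(region.minor_axis_length, 1e-9),
            # proxies for the paper's uniformity parameters (assumptions):
            "heterogeneity": float(np.std(pixels) / (np.mean(pixels) + 1e-9)),
            "clumpiness": float(np.mean(pixels > np.mean(pixels))),
        })
    return feats
```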
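
The classifier comparison and probability-based rejection step could be reproduced along the following lines. This is a minimal sketch assuming scikit-learn, per-image summary statistics of the track descriptors as input features, placeholder data in place of the real feature matrix, and a hypothetical acceptance threshold of 0.9; none of these choices are stated in the text.

```python
# Illustrative sketch (not the authors' code): train an SVM and a small dense
# network on per-image track descriptors, then use the predicted class
# probabilities to accept or reject an image before quantification.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, precision_score

rng = np.random.default_rng(0)

# Placeholder feature matrix: one row per image, columns are summary statistics
# of the descriptors named in the text (area, diameter, roundness, aspect ratio,
# heterogeneity, clumpiness). Real values would come from track quantification.
n_images, n_features = 400, 12
X = rng.normal(size=(n_images, n_features))
y = rng.integers(0, 2, size=n_images)  # 1 = adequate image, 0 = inadequate

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, probability=True))
nn = make_pipeline(StandardScaler(),
                   MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000,
                                 random_state=0))

for name, model in [("SVM", svm), ("NN", nn)]:
    model.fit(X_tr, y_tr)
    y_hat = model.predict(X_te)
    print(name, "accuracy:", accuracy_score(y_te, y_hat),
          "precision:", precision_score(y_te, y_hat))

# Verification step: only images whose predicted probability of being adequate
# exceeds a (hypothetical) threshold proceed to track counting.
THRESHOLD = 0.9
proba_adequate = nn.predict_proba(X_te)[:, 1]
accepted = proba_adequate >= THRESHOLD
print("images accepted for quantification:", int(accepted.sum()), "/", len(accepted))
```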