Background and Objectives: Device-assisted enteroscopy (DAE) plays a significant role in the evaluation of enteric lesions. Ulcers and erosions are frequent endoscopic findings and can be associated with many nosological entities, namely Crohn’s disease. Although the application of artificial intelligence (AI) is growing exponentially across image-based gastroenterology procedures, evidence of its technical feasibility and clinical applicability in DAE is still lacking. This study aimed to develop and test a multi-brand convolutional neural network (CNN)-based algorithm for the automatic detection of ulcers and erosions in DAE. Materials and Methods: A unicentric retrospective study was conducted to develop the CNN, based on a total of 250 DAE exams. A total of 6772 images were used, of which 678 were classified as showing ulcers or erosions after double validation. Data were divided into a training set and a validation set, the latter being used to assess the model’s performance. The primary outcome measures were sensitivity, specificity, accuracy, positive predictive value (PPV), negative predictive value (NPV), and the area under the precision–recall curve (AUC-PR). Results: Sensitivity, specificity, PPV, and NPV were 88.5%, 99.7%, 96.4%, and 98.9%, respectively. The algorithm’s accuracy was 98.7%, and the AUC-PR was 1.00. The CNN processed 293.6 frames per second, enabling live AI application in a real-life clinical DAE setting. Conclusion: To the best of our knowledge, this is the first study on the automatic multi-brand panendoscopic detection of ulcers and erosions throughout the digestive tract during DAE, overcoming a relevant interoperability challenge. Our results show that using a CNN to detect this type of lesion is associated with high overall accuracy. The development of binary CNNs for the automatic detection of clinically relevant endoscopic findings and the assessment of endoscopic inflammatory activity are relevant steps toward AI application in digestive endoscopy, particularly for panendoscopic evaluation.
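The abstract reports per-frame diagnostic metrics (sensitivity, specificity, PPV, NPV, accuracy, and AUC-PR) for a binary ulcer/erosion classifier. The following is a minimal, illustrative sketch of how such metrics could be computed from frame-level labels and model scores using scikit-learn; it is not the authors' pipeline, and the variable names (`y_true`, `y_score`) and the 0.5 decision threshold are assumptions for the example.

```python
# Illustrative sketch only: computing standard diagnostic metrics for a
# binary lesion classifier from frame-level ground truth and model scores.
import numpy as np
from sklearn.metrics import confusion_matrix, average_precision_score

def diagnostic_metrics(y_true: np.ndarray, y_score: np.ndarray, threshold: float = 0.5) -> dict:
    """Return sensitivity, specificity, PPV, NPV, accuracy, and AUC-PR."""
    y_pred = (y_score >= threshold).astype(int)
    # Confusion matrix for labels {0: normal, 1: ulcer/erosion}.
    tn, fp, fn, tp = confusion_matrix(y_true, y_pred, labels=[0, 1]).ravel()
    return {
        "sensitivity": tp / (tp + fn),                     # recall for the lesion class
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),                             # positive predictive value (precision)
        "npv": tn / (tn + fn),
        "accuracy": (tp + tn) / (tp + tn + fp + fn),
        # Average precision approximates the area under the precision-recall curve.
        "auc_pr": average_precision_score(y_true, y_score),
    }

if __name__ == "__main__":
    # Dummy data: 1 = frame with ulcer/erosion, 0 = normal mucosa frame.
    rng = np.random.default_rng(0)
    y_true = rng.integers(0, 2, size=1000)
    y_score = np.clip(y_true * 0.8 + rng.normal(0.1, 0.2, size=1000), 0.0, 1.0)
    print(diagnostic_metrics(y_true, y_score))
```

In this sketch the AUC-PR is computed from the continuous scores rather than the thresholded predictions, which matches the usual definition of the precision-recall curve over all operating points.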