Premise
Automated disease, weed, and crop classification with computer vision will be invaluable in the future of agriculture. However, existing model architectures such as ResNet, EfficientNet, and ConvNeXt often underperform on the smaller, specialised datasets typical of such projects.

Methods
We address this gap with informed data collection and the development of a new convolutional neural network architecture, PhytNet. Using a novel dataset of infrared cocoa tree images, we demonstrate PhytNet's development and compare its performance with that of existing architectures. Data collection was informed by spectroscopy data, which provided useful insights into the spectral characteristics of cocoa trees. Cocoa was chosen as the focal species because of the diverse pathology of its diseases, which poses significant challenges for detection.

Results
ResNet18 showed some signs of overfitting, while the EfficientNet variants overfitted markedly. By contrast, PhytNet displayed excellent attention to relevant features, almost no overfitting, and an exceptionally low computational cost of 1.19 GFLOPs.

Conclusions
We show that PhytNet is a promising candidate for rapid disease or plant classification and for precise localisation of disease symptoms in autonomous systems. We also show that the most informative light spectra for detecting cocoa disease lie outside the visible spectrum, and that efforts to detect disease in cocoa should focus on local symptoms rather than the systemic effects of disease.