We proposed a novel deep convolutional neural network (DCNN) using inverted residuals and linear bottleneck layers for diagnosing grey blight disease on tea leaves. The proposed DCNN consists of three bottleneck blocks, two pairs of convolutional (Conv) layers, and three dense layers. The bottleneck blocks contain depthwise, standard, and linear convolution layers. A digital single-lens reflex (DSLR) camera was used to collect 1320 images of tea leaves from the North Bengal region of India to prepare the tea grey blight disease dataset. The non-grey blight tea leaf images in the dataset were categorized into two subclasses: healthy leaves and leaves with other diseases. Image transformation techniques, namely principal component analysis (PCA) color augmentation, random rotations, random shifts, random flips, resizing, and rescaling, were used to generate augmented tea leaf images, expanding the dataset from 1320 to 5280 images. The proposed DCNN model was trained and validated on 5016 images of healthy, grey blight infected, and other diseased tea leaves, and the classification performance of the proposed and existing state-of-the-art techniques was evaluated on the remaining 264 test images. On the test data, the proposed DCNN achieved a classification accuracy of 98.99%, precision of 98.51%, recall of 98.48%, F-measure of 98.49%, and a misclassification rate of 1.01%. The test results show that the proposed DCNN model outperformed the existing techniques for tea grey blight disease detection.
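
A minimal Keras sketch of the architecture described above is given below: two pairs of standard convolution layers, three inverted-residual bottleneck blocks (a standard 1x1 expansion convolution, a depthwise convolution, and a linear 1x1 projection), and three dense layers ending in a three-way softmax for healthy, grey blight, and other diseased leaves. The filter counts, kernel sizes, strides, and the 224x224 input size are illustrative assumptions, not values reported in the paper.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def bottleneck_block(x, expand_filters, project_filters, stride=1):
    """Inverted residual with a linear bottleneck (MobileNetV2-style)."""
    inp = x
    x = layers.Conv2D(expand_filters, 1, padding="same")(x)            # standard 1x1 expansion
    x = layers.ReLU(6.0)(x)
    x = layers.DepthwiseConv2D(3, strides=stride, padding="same")(x)   # depthwise convolution
    x = layers.ReLU(6.0)(x)
    x = layers.Conv2D(project_filters, 1, padding="same")(x)           # linear projection (no activation)
    if stride == 1 and inp.shape[-1] == project_filters:
        x = layers.Add()([inp, x])                                     # residual connection
    return x

def build_model(num_classes=3, input_shape=(224, 224, 3)):
    inputs = layers.Input(shape=input_shape)
    # First pair of convolutional layers
    x = layers.Conv2D(32, 3, strides=2, padding="same", activation="relu")(inputs)
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)
    # Three bottleneck blocks
    x = bottleneck_block(x, expand_filters=96, project_filters=32, stride=1)
    x = bottleneck_block(x, expand_filters=144, project_filters=48, stride=2)
    x = bottleneck_block(x, expand_filters=192, project_filters=64, stride=2)
    # Second pair of convolutional layers
    x = layers.Conv2D(128, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(128, 3, padding="same", activation="relu")(x)
    # Three dense layers ending in the 3-class softmax
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.Dense(256, activation="relu")(x)
    x = layers.Dense(128, activation="relu")(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)
    return models.Model(inputs, outputs)

model = build_model()
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
```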
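
The augmentation pipeline named in the abstract could be sketched as follows, using Keras' `ImageDataGenerator`. The rotation and shift ranges, flip modes, target size, batch size, the directory layout, and the strength of the PCA color jitter (`alpha_std`) are illustrative assumptions; the paper's exact settings may differ.

```python
import numpy as np
from tensorflow.keras.preprocessing.image import ImageDataGenerator

def pca_color_jitter(image, alpha_std=0.1):
    """AlexNet-style PCA color augmentation applied to a single HxWx3 image."""
    img = image.astype(np.float64) / 255.0
    flat = img.reshape(-1, 3)
    flat_centered = flat - flat.mean(axis=0)
    cov = np.cov(flat_centered, rowvar=False)            # 3x3 channel covariance
    eigvals, eigvecs = np.linalg.eigh(cov)
    alphas = np.random.normal(0.0, alpha_std, size=3)
    shift = eigvecs @ (alphas * eigvals)                  # perturb along principal components
    jittered = np.clip(img + shift, 0.0, 1.0)
    return (jittered * 255.0).astype(image.dtype)

datagen = ImageDataGenerator(
    rescale=1.0 / 255.0,                # rescaling
    rotation_range=30,                  # random rotations
    width_shift_range=0.1,              # random horizontal shifts
    height_shift_range=0.1,             # random vertical shifts
    horizontal_flip=True,               # random flips
    vertical_flip=True,
    preprocessing_function=pca_color_jitter,
)

# flow_from_directory resizes every image to target_size; one sub-folder per
# class (healthy / grey_blight / other_disease) is assumed for the dataset layout.
train_iter = datagen.flow_from_directory(
    "tea_leaf_dataset/train", target_size=(224, 224),
    batch_size=32, class_mode="categorical",
)
```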