Lightning often causes death, injury, and damage to facilities and equipment, so accurately locating where lightning occurs by predicting thunderstorms and lightning is of great practical significance. Traditional lightning detection systems detect lightning by measuring the acoustic, optical, and electromagnetic signals it radiates. These systems typically suffer from two problems: first, the detection of lightning signals is susceptible to electromagnetic interference; second, the equipment is expensive, which makes it poorly suited to lightning detection tasks that target only specific scenarios. To detect lightning more conveniently, we propose a lightning detection model based on deep learning networks. With cameras now ubiquitous in modern society, designing deep learning-based lightning object detection networks has become feasible. However, existing practice reveals two problems: (1) during strong lightning events, the lightning features in an image are obscured by intense brightness, and convolutional neural networks cannot distinguish strong lightning scenes from strong ultraviolet scenes; (2) the performance of a convolutional neural network is generally tied to its size, and larger models perform better, yet in practical application scenarios the available computing resources cannot accommodate sufficiently large networks. In this paper, we propose a simple and effective lightning object detection network (LD-Net) and use a foreground-background segmentation algorithm to locate the frames of a video that contain lightning. After applying a knowledge distillation-based model compression method, the lightning object detection network with an 18-layer ResNet backbone (LD-Net-18) reaches an mAP of 82.4%.
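To illustrate the frame-selection step, a minimal sketch of one common foreground-background segmentation approach (the paper does not specify its algorithm; the function name, running-average background model, and threshold here are illustrative assumptions): a frame whose mean brightness jumps well above a slowly updated background estimate is flagged as a candidate lightning frame.

```python
# Hypothetical sketch, not the paper's actual algorithm: flag candidate
# lightning frames by comparing each frame's mean brightness against a
# running (exponential moving average) background estimate.

def flag_lightning_frames(frames, alpha=0.9, threshold=40.0):
    """frames: list of 2D brightness grids (lists of lists of floats).
    Returns indices of frames whose mean brightness exceeds the running
    background estimate by more than `threshold`."""
    candidates = []
    background = None
    for i, frame in enumerate(frames):
        mean = sum(sum(row) for row in frame) / (len(frame) * len(frame[0]))
        if background is None:
            background = mean          # initialize from the first frame
        elif mean - background > threshold:
            candidates.append(i)       # sudden brightening: candidate frame
        # slowly adapt the background model to gradual illumination change
        background = alpha * background + (1 - alpha) * mean
    return candidates
```

Only the flagged frames would then be passed to LD-Net, which keeps the detector off the hot path for the vast majority of dark, lightning-free frames.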
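The compression step can be sketched with the standard soft-target distillation loss (Hinton et al.); the abstract does not give the paper's exact distillation objective, so the temperature value and loss form below are assumptions, shown in plain Python for clarity.

```python
import math

# Hypothetical sketch of knowledge distillation, not the paper's exact loss:
# cross-entropy between temperature-softened teacher and student outputs,
# scaled by T^2 as in the standard formulation.

def softmax(logits, T):
    """Temperature-softened softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """Soft-target cross-entropy H(p_teacher, p_student), scaled by T^2."""
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    return -T * T * sum(pt * math.log(ps) for pt, ps in zip(p_t, p_s))
```

In this setup a large LD-Net would act as the teacher and LD-Net-18 as the student; the loss is minimal when the student's softened distribution matches the teacher's, which is what lets the small backbone recover most of the large model's accuracy.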
We hope that the proposed LD‐Net can serve as a simple and powerful alternative to traditional lightning detection methods, enhancing efficiency in lightning detection tasks.