Rice leaf infections are a common hazard to rice production, affecting farmers all over the world. Early detection and treatment of rice leaf infection are critical for promoting healthy rice plant growth and ensuring an adequate supply for the fast-growing population. Computer-assisted rice leaf disease diagnosis is hampered by complex image backgrounds. Popular Convolutional Neural Network (CNN) architectures extract features from images and diagnose the disease, but these methods are best suited to segmented images and yield low accuracy on real-time field images. Here the Internet of Things offers a paradigm shift: it collects agro-meteorological information that effectively aids rice disease diagnosis. Motivated by the usefulness of CNN models and agricultural IoT, a novel multimodal data fusion framework named Rice-Fusion is proposed to diagnose rice disease. Diagnosis based on a single modality may not be accurate, so fusing heterogeneous modalities is essential for robust and reliable disease diagnosis; this adds a new dimension to the domain of rice disease diagnosis. The dataset was collected manually, comprising 3200 samples across rice health categories from two modalities, namely agro-meteorological sensors and a camera. The Rice-Fusion framework first extracts numerical features from the sensor data, then extracts visual features from the captured rice images. The extracted features are fused through a concatenation layer followed by a dense layer, which provides a single output for diagnosing the rice disease. The testing accuracy of Rice-Fusion is 95.31%, compared with unimodal accuracies of 82.03% and 91.25% for CNN and Multi-Layer Perceptron (MLP) architectures, respectively.
Analysis of the experimental results demonstrates that the proposed Rice-Fusion multimodal data fusion framework outperforms the unimodal frameworks.
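The fusion step described above (concatenating sensor and image features, then applying a dense layer) can be sketched as a minimal numpy forward pass. The feature dimensions, class count, and random stand-in weights below are illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical branch outputs: 5 agro-meteorological readings
# (e.g. temperature, humidity) and a 64-d CNN image embedding.
sensor_feats = rng.normal(size=(1, 5))    # MLP-branch output (stand-in)
image_feats = rng.normal(size=(1, 64))    # CNN-branch output (stand-in)

# Fusion: concatenate both modalities, then one dense layer
# mapping to the rice health categories (4 classes assumed here).
fused = np.concatenate([sensor_feats, image_feats], axis=1)  # (1, 69)
W = rng.normal(size=(69, 4)) * 0.1
b = np.zeros(4)
probs = softmax(fused @ W + b)

print(fused.shape)   # (1, 69)
print(probs.sum())   # class probabilities sum to 1
```

In a trained model the two branches would be learned CNN and MLP networks and `W`, `b` would be fit jointly with them; the sketch only shows how the concatenation layer joins the modalities into a single prediction.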
Pathogens such as fungi and bacteria can cause rice diseases that drastically impair crop production. Because these diseases are difficult to control at a broad scale, crop field monitoring is one of the most effective methods of control: it allows early detection of disease and the implementation of preventative measures. Severity estimation based on digital image analysis, where the images are obtained from the rice field using mobile devices, is one of the most effective control strategies. This paper offers a method for quantifying the severity of three rice crop diseases (brown spot, blast, and bacterial blight) that can determine the stage of plant disease. A total of 1200 images of diseased and healthy rice plants make up the input dataset. With the help of agricultural experts, the diseased zones were labeled by disease type using the Make Sense tool. More than 75% of the images in the dataset carry a single disease label, healthy plants represent more than 15%, and images with multiple diseases make up 5% of the labels. This paper proposes a novel artificial-intelligence rice grade model that uses an optimized faster-region-based convolutional neural network (FRCNN) approach to calculate the area of leaf instances and of the infected regions. The EfficientNet-B0 architecture was used as the backbone, as this network shows the best accuracy (96.43%); performance was compared with the CNN architectures VGG16, ResNet101, and MobileNet. The evaluation metrics used to measure accuracy are positive predictive value, sensitivity, and intersection over union. This severity estimation method can be deployed as a tool that gives farmers accurate predictions of disease severity from lesions under field conditions and supports more organic crop production.
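Two quantities at the core of this abstract, the intersection-over-union localization metric and a severity score derived from leaf and lesion areas, can be sketched in a few lines. The box format and the ratio-based severity definition below are plausible assumptions, not the paper's exact formulation:

```python
def iou(box_a, box_b):
    # Intersection-over-Union of two axis-aligned boxes (x1, y1, x2, y2),
    # the localization metric reported alongside PPV and sensitivity.
    x1 = max(box_a[0], box_b[0]); y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2]); y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

def severity(leaf_area, lesion_areas):
    # Assumed severity definition: fraction of the detected leaf
    # instance covered by the detected lesion regions.
    return sum(lesion_areas) / leaf_area

print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # 25 / 175 ≈ 0.1429
print(severity(400.0, [20.0, 12.0]))        # 0.08
```

In the described pipeline the leaf and lesion areas would come from the FRCNN detections rather than being supplied by hand; the ratio then maps naturally onto discrete severity stages.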
Rice disease classification is vital during the cultivation of rice crops. Rice diseases were initially detected through visual examination by agricultural experts; later the detection process was automated using images. However, captured images alone lack supporting contextual information, and traditional image-only approaches are less accurate on real-time images. To address this limitation, a novel Rice Transformer is proposed that simultaneously merges inputs from agricultural sensors with image data captured in the fields. The proposed system consists of two branches, a sensor branch and an image branch, and an attention mechanism extracts features from both modalities. The extracted features are then fed into a cross-attention module in a crisscross fashion, enhancing the ability to identify features specific to rice diseases. The attended features are pooled, merged, and passed through a Softmax classifier to classify the rice disease precisely. The dataset is a customized collection of 4200 samples gathered in real time from rice farms. Experiments on this dataset show that the proposed approach outperforms all the other fusion and attention models considered for comparison in this paper. Ablation analysis and performance metrics are reported to determine the effectiveness of the proposed system. The results are promising: the proposed Rice Transformer model achieves an accuracy of 97.38% for rice disease control.
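The crisscross cross-attention exchange described above, where each branch queries the other modality before pooling and classification, can be sketched with plain scaled dot-product attention. Token counts, embedding size, class count, and the omission of learned projections are all simplifying assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def softmax(z):
    # Numerically stable softmax over the last axis.
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def cross_attention(q_src, kv_src):
    # Scaled dot-product attention where queries come from one modality
    # and keys/values from the other (learned projections omitted).
    d = q_src.shape[-1]
    scores = q_src @ kv_src.T / np.sqrt(d)
    return softmax(scores) @ kv_src

# Hypothetical token sequences: 6 sensor tokens and 10 image tokens,
# both embedded in a shared 32-d space.
sensor_tokens = rng.normal(size=(6, 32))
image_tokens = rng.normal(size=(10, 32))

# Crisscross exchange: each branch attends over the other modality.
sensor_attended = cross_attention(sensor_tokens, image_tokens)
image_attended = cross_attention(image_tokens, sensor_tokens)

# Pool each branch, merge, and classify (4 disease classes assumed).
pooled = np.concatenate([sensor_attended.mean(0), image_attended.mean(0)])
logits = pooled @ (rng.normal(size=(64, 4)) * 0.1)
print(softmax(logits).shape)  # (4,)
```

Each branch keeps its own sequence length while borrowing information from the other modality, which is the essential property the crisscross arrangement exploits before the pooled features are merged for the Softmax classifier.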