2020
DOI: 10.1016/j.biosystemseng.2020.03.020
Identification and recognition of rice diseases and pests using convolutional neural networks

Cited by 374 publications (152 citation statements)
References 13 publications
“…For the classification task, the concept of transfer learning was used. While modern neural networks analyse the simplest graphical primitives in their lower layers, transfer learning has already proven itself on many more complex tasks [7,16]. A neural network pre-trained on a large number of images gives an advantage in training cost and speed.…”
Section: Fig. 1. Labeled images from the dataset (example) (unclassified)
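The transfer-learning idea in the statement above — keep the pre-trained lower layers frozen and train only a small task-specific head — can be sketched without any deep-learning framework. The following is an illustrative NumPy sketch, not the cited authors' pipeline: the fixed random projection merely stands in for a real pretrained backbone, and all names and sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a pretrained backbone: a FROZEN feature extractor.
# In practice this would be e.g. a CNN pretrained on a large image corpus.
W_backbone = rng.normal(size=(64, 16))          # frozen weights, never updated

def extract_features(x):
    return np.tanh(x @ W_backbone)              # fixed feature map

# Tiny synthetic "dataset": the label depends on the first input dimension.
X = rng.normal(size=(200, 64))
y = (X[:, 0] > 0).astype(int)

# Only the linear head is trained (logistic regression on frozen features),
# which is why transfer learning is cheap and fast to train.
feats = extract_features(X)
w = np.zeros(16)
b = 0.0
lr = 0.5
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(feats @ w + b)))  # sigmoid
    grad = p - y
    w -= lr * feats.T @ grad / len(y)           # update the head only
    b -= lr * grad.mean()

acc = ((feats @ w + b > 0).astype(int) == y).mean()
print(f"head-only training accuracy: {acc:.2f}")
```

Because only the 17 head parameters are updated, training is far cheaper than fitting the whole backbone, which is the advantage the quoted statement refers to.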
“…Step 5: The cluster center corresponds to the target firefly, i.e. the one with the highest fluorescence brightness. Calculate the Euclidean distance of each remaining sample point to every cluster center and assign fluorescence brightness according to formula (14).…”
Section: Basic Steps of the FCM-KM Algorithm (mentioning)
confidence: 99%
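Step 5 above can be sketched as follows. The excerpt does not reproduce formula (14), so the inverse-distance brightness rule below is only a hypothetical stand-in; the Euclidean-distance computation is the part the quoted step actually specifies.

```python
import numpy as np

def assign_brightness(points, centers):
    """Euclidean distance from each point to every cluster center, plus a
    brightness that decays with distance (stand-in for formula (14))."""
    # diffs: shape (n_points, n_centers, dim); dists: (n_points, n_centers)
    diffs = points[:, None, :] - centers[None, :, :]
    dists = np.linalg.norm(diffs, axis=2)
    # Hypothetical brightness rule: closer to a center -> brighter.
    brightness = 1.0 / (1.0 + dists)
    return dists, brightness

points = np.array([[0.0, 0.0], [3.0, 4.0], [6.0, 8.0]])
centers = np.array([[0.0, 0.0], [6.0, 8.0]])   # the "target fireflies"
d, b = assign_brightness(points, centers)
print(d)   # point (3, 4) is 5 units from either center
```

A cluster center coincides with a sample point at distance 0, so under this stand-in rule it receives the maximal brightness of 1, matching the quoted idea that the target firefly is the brightest.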
“…Under a 10-fold cross-validation strategy, the average recognition rate over 10 common rice diseases was found to be 95.48%. Reference [14] proposes a new stacked CNN architecture that uses two-stage training to significantly reduce model size while maintaining high classification accuracy. The experimental results showed that, compared with VGG16, the stacked CNN reached a test accuracy of 95% while reducing the model size by 98%.…”
Section: Introduction (mentioning)
confidence: 99%
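The 10-fold cross-validation protocol behind the 95.48% average recognition rate splits the data into ten folds so that every sample is held out exactly once and the per-fold accuracies are averaged. A minimal index-splitting sketch, independent of any ML library:

```python
def k_fold_indices(n_samples, k=10):
    """Split indices 0..n_samples-1 into k contiguous, near-equal folds."""
    fold_sizes = [n_samples // k + (1 if i < n_samples % k else 0)
                  for i in range(k)]
    folds, start = [], 0
    for size in fold_sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_validate(n_samples, k=10):
    """Yield (train_indices, test_indices); each index is tested exactly once."""
    folds = k_fold_indices(n_samples, k)
    for i, test_idx in enumerate(folds):
        train_idx = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train_idx, test_idx

splits = list(cross_validate(25, k=10))
print(len(splits))  # 10 train/test splits
```

In practice the folds would be shuffled (and often stratified by disease class) before splitting; the sketch keeps them contiguous for clarity.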
“…Many researchers have worked on the automatic diagnosis of rice diseases through conventional means such as pattern recognition techniques ( Phadikar & Sil, 2008 ; Rahman et al, 2020 ), support vector machines ( Phadikar, Sil & Das, 2012 ; Prajapati, Shah & Dabhi, 2017 ), digital image processing techniques ( Arnal Barbedo, 2013 ; Zhou et al, 2013 ; Sanyal et al, 2008 ; Sanyal & Patel, 2008 ) and computer vision ( Asfarian et al, 2014 ) to improve the accuracy and speed of diagnosis. In an earlier study, Phadikar & Sil (2008) proposed a rice disease identification approach in which diseased rice images were classified using a Self Organizing Map (SOM, a neural network); the training images were obtained by extracting features of the infected parts of the leaf, while four different types of images were used for testing.…”
Section: Introduction (mentioning)
confidence: 99%
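A Self Organizing Map of the kind used by Phadikar & Sil (2008) assigns each input to its best-matching unit (BMU) and pulls that unit, together with its grid neighbours, toward the input, so that similar inputs end up on nearby units. A minimal 1-D sketch under assumed hyper-parameters, not their actual configuration:

```python
import numpy as np

rng = np.random.default_rng(1)

n_units, dim = 4, 2
weights = rng.normal(size=(n_units, dim))      # codebook vectors on a 1-D grid

def train_som(data, weights, epochs=50, lr=0.3, radius=1):
    w = weights.copy()
    for _ in range(epochs):
        for x in data:
            bmu = np.argmin(np.linalg.norm(w - x, axis=1))  # best-matching unit
            for j in range(n_units):
                if abs(j - bmu) <= radius:                  # grid neighbourhood
                    w[j] += lr * (x - w[j])                 # pull toward input
    return w

# Two well-separated synthetic clusters standing in for two image classes.
data = np.vstack([rng.normal(0, 0.1, (20, 2)),
                  rng.normal(5, 0.1, (20, 2))])
w = train_som(data, weights)

bmu_a = int(np.argmin(np.linalg.norm(w - data[0], axis=1)))
bmu_b = int(np.argmin(np.linalg.norm(w - data[-1], axis=1)))
print(bmu_a, bmu_b)
```

After training, points from the two clusters map to different units, which is what makes the trained map usable as a classifier once its units are labeled.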
“… Fuentes et al (2017) proposed a deep-learning-based approach using three architectures, namely the Faster Region-based Convolutional Neural Network (Faster R-CNN), the Region-based Fully Convolutional Network (R-FCN) and the Single Shot Multibox Detector (SSD), that can effectively recognize nine different types of diseases and pests in tomato plants. In a recent study, Rahman et al (2020) developed a CNN approach for detecting diseases and pests from rice plant images (five disease classes, three pest classes, one healthy-plant class and others). A total of 1,426 images was collected, captured using four different types of cameras, and the system achieved a mean validation accuracy of 94.33%.…”
Section: Introduction (mentioning)
confidence: 99%