2018
DOI: 10.1007/978-3-319-75417-8_53

Depth Learning with Convolutional Neural Network for Leaves Classifier Based on Shape of Leaf Vein

Cited by 6 publications (2 citation statements)
References 7 publications
“…Chaki et al. [20] proposed a novel approach that combines fuzzy-color and edge-texture histograms to recognize fragmented leaf images. Features based on Gabor filters [21,22], fractal dimension [23], locality projection analysis (SLPA) [24], kernel-based principal component analysis (KPCA) [25], bag of words (BOW) [22,26] and convolutional neural networks (CNN) [27] have also been used for leaf recognition.…”
Section: Introduction (mentioning)
Confidence: 99%
“…Recently, owing to the excellent performance of deep convolutional neural networks in computer vision, they have become the main means of solving image classification, image recognition and semantic segmentation problems [14]-[16]. Deep learning methods applied to plant classification have achieved good results, outperforming most hand-crafted feature extraction methods overall and showing particularly strong generalization [17]-[19].…”
Section: Introduction (mentioning)
Confidence: 99%
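
The statement above refers to CNN-based leaf classifiers only in general terms. As a point of reference, the following is a minimal illustrative sketch (in PyTorch) of such a classifier; the layer sizes, the 64x64 input resolution and the 5-class output are assumptions made for the example and are not taken from the cited paper.

# Minimal sketch of a CNN leaf-image classifier of the kind the citing works
# describe. All layer sizes, the 5-class output and the 64x64 RGB input are
# illustrative assumptions, not details from the cited paper.
import torch
import torch.nn as nn

class LeafCNN(nn.Module):
    def __init__(self, num_classes: int = 5):  # num_classes is a placeholder
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),  # RGB image -> 16 feature maps
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))  # class logits per image

if __name__ == "__main__":
    model = LeafCNN()
    dummy_batch = torch.randn(4, 3, 64, 64)  # 4 synthetic 64x64 RGB "leaf" images
    logits = model(dummy_batch)
    print(logits.shape)                      # torch.Size([4, 5])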