2015 IEEE International Conference on Computer and Information Technology; Ubiquitous Computing and Communications; Dependable, Autonomic and Secure Computing; Pervasive Intelligence and Computing (CIT/IUCC/DASC/PICOM), 2015
DOI: 10.1109/cit/iucc/dasc/picom.2015.318

A Convolutional Neural Network for Leaves Recognition Using Data Augmentation

Cited by 82 publications (33 citation statements) | References 14 publications
“…Comparing the classification results on the Flavia dataset with the work of other authors, Satti et al. [48] reported accuracies of 85.9% and 93.3% for KNN and ANN classifiers, respectively, while Zhang et al. [49] presented a table comparing accuracy values with 13 other authors' schemes (on the same Flavia dataset). The average accuracy of the works listed in that table was 87.37%.…”
Section: Discussion (mentioning)
confidence: 99%
“…Each subset was an independent combination of 400 patients for training and 100 patients for testing, to prevent the mixing of patient images between the training and testing sets within the subsets. To enable effective learning from the training dataset, we performed data augmentation ( 19 , 20 ) using image rotation from −15° to 15° in 3° steps and a zoom rate from 0.9 to 1.1 in 0.1 steps.…”
Section: Methods (mentioning)
confidence: 99%
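The augmentation settings quoted above (rotations from −15° to +15° in 3° steps combined with zoom factors from 0.9 to 1.1 in 0.1 steps) can be sketched roughly as follows. This is a minimal Pillow-based illustration, not the cited work's implementation: the function name, the bilinear resampling, and the center crop/pad used to keep a fixed output size are assumptions.

# Minimal sketch (assumed Pillow-based) of the quoted augmentation scheme:
# rotations of -15..+15 degrees in 3-degree steps and zoom factors 0.9..1.1
# in 0.1 steps. Names and the crop/pad strategy are illustrative assumptions.
from PIL import Image

ROTATION_DEGREES = range(-15, 16, 3)   # -15, -12, ..., +12, +15 (11 angles)
ZOOM_FACTORS = [0.9, 1.0, 1.1]         # 0.9 to 1.1 in 0.1 steps

def augment(image):
    """Yield rotated and zoomed copies of one training image."""
    w, h = image.size
    for angle in ROTATION_DEGREES:
        rotated = image.rotate(angle, resample=Image.BILINEAR)
        for zoom in ZOOM_FACTORS:
            # Rescale, then paste centered onto a canvas of the original
            # size so every augmented sample keeps identical dimensions.
            scaled = rotated.resize((int(w * zoom), int(h * zoom)),
                                    resample=Image.BILINEAR)
            canvas = Image.new(image.mode, (w, h))
            canvas.paste(scaled, ((w - scaled.width) // 2,
                                  (h - scaled.height) // 2))
            yield canvas

# Example: one image expands into 11 x 3 = 33 augmented variants.
# variants = list(augment(Image.open("leaf.png")))

Each training image thus yields 33 variants; how borders left by the 0.9 zoom are filled (here, a blank canvas) is another detail the quoted statement does not specify.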
“…Although less popular, there are other excellent deep learning libraries, such as CNTK [9], Deeplearning4j [10], Blocks [11], Gluon [12], and Lasagne [13], which can also be employed in mobile systems. The choice among them depends on the specific application.…”
Section: Dedicated Deep Learning Libraries (mentioning)
confidence: 99%