2018 IEEE 3rd International Conference on Cloud Computing and Big Data Analysis (ICCCBDA)
DOI: 10.1109/icccbda.2018.8386540
Image classification based on improved random forest algorithm

Cited by 12 publications (10 citation statements)
References 10 publications
“…When forming a decision tree, the criterion or attribute value on which a branch splits is chosen by computing the information gain and the information gain ratio (Ozcan et al., 2020). A DT is a variant of a greedy algorithm that proceeds top-down in divide-and-conquer fashion, applying a set of decision rules (Man et al., 2018). The algorithm builds a tree structure in which class labels are expressed at the leaves.…”
Section: Decision Trees
Mentioning confidence: 99%
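The greedy gain-based splitting step described in the excerpt above can be sketched in a few lines. The helper names below are illustrative, not taken from the cited papers:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy Ent(D) = -sum p_k * log2(p_k) over class proportions."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(labels, groups):
    """Gain(D, a) = Ent(D) - sum |D_v|/|D| * Ent(D_v), where `groups` are
    the label subsets produced by splitting D on a candidate attribute a."""
    n = len(labels)
    return entropy(labels) - sum(len(g) / n * entropy(g) for g in groups)

# A perfect split recovers the full entropy; a useless split gains nothing.
labels = ["cat", "cat", "dog", "dog"]
print(information_gain(labels, [["cat", "cat"], ["dog", "dog"]]))  # 1.0
print(information_gain(labels, [["cat", "dog"], ["cat", "dog"]]))  # 0.0
```

A top-down tree builder simply evaluates this gain for every candidate attribute, branches on the best one, and recurses into each subset until the leaves are (nearly) pure.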
“…The nodes are branched and tree structures are formed from the training data according to splitting rules such as maximum information gain, maximum information gain ratio, and minimum Gini index. The information gain and the Gini index obtained by using attribute a to divide the sample set D are shown with the node-splitting formula given below [31].…”
Section: Random Forest in Spark MLlib
Mentioning confidence: 99%
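The node-splitting formula referenced as [31] is not reproduced in the excerpt. In their standard textbook form, the entropy-based information gain and the Gini index for splitting a sample set $D$ on an attribute $a$ with values $v = 1, \dots, V$ are:

```latex
\mathrm{Ent}(D) = -\sum_{k=1}^{K} p_k \log_2 p_k,
\qquad
\mathrm{Gain}(D, a) = \mathrm{Ent}(D) - \sum_{v=1}^{V} \frac{|D^v|}{|D|}\,\mathrm{Ent}(D^v),

\mathrm{Gini}(D) = 1 - \sum_{k=1}^{K} p_k^2,
\qquad
\mathrm{Gini\text{-}index}(D, a) = \sum_{v=1}^{V} \frac{|D^v|}{|D|}\,\mathrm{Gini}(D^v),
```

where $p_k$ is the proportion of class $k$ in $D$ and $D^v$ is the subset of $D$ taking value $v$ on attribute $a$. A split maximizes the gain (or minimizes the weighted Gini index); these are the standard forms, which may differ in notation from the formula in [31].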
“…A high smoothness value indicates that the image has smooth intensity. Smoothness can be calculated by formula (6).…”
Section: Histogram Feature Extraction
Mentioning confidence: 99%
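Formula (6) is not reproduced in the excerpt. A common histogram-based texture descriptor is the relative smoothness $R = 1 - 1/(1+\sigma^2)$ over intensities normalized to $[0,1]$; under the excerpt's convention, where a higher value means a smoother image, the complement $S = 1/(1+\sigma^2)$ fits, and that is what the sketch below assumes — it is not necessarily the paper's exact formula:

```python
def smoothness(pixels):
    """S = 1 / (1 + sigma^2) over intensities normalized to [0, 1].

    S is 1.0 for a constant (perfectly smooth) region and decreases
    as the intensity variance sigma^2 grows.
    """
    g = [p / 255.0 for p in pixels]  # assume 8-bit grayscale input
    mean = sum(g) / len(g)
    var = sum((v - mean) ** 2 for v in g) / len(g)
    return 1.0 / (1.0 + var)

print(smoothness([128] * 64))     # constant region  -> 1.0
print(smoothness([0, 255] * 32))  # high-contrast region -> 0.8
```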
“…With the development of information technology, fruit maturity can be identified with the help of computers [3] [4]. Identification can be performed by classifying images of tomatoes with various methods such as K-Nearest Neighbor (KNN) [4] [5], Random Forest [6], Support Vector Machine (SVM) [7] [8], Naïve Bayes [7] [8] [9] [10] [11] [12] [13], etc. In this study, the maturity of a set of tomatoes is identified using the Naïve Bayes algorithm.…”
Section: Introduction
Mentioning confidence: 99%
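The Naïve Bayes classification mentioned above can be sketched from scratch. The feature choice here — hypothetical mean (R, G, B) values per tomato image — and the toy data are illustrative assumptions, not taken from the cited study:

```python
import math
from collections import defaultdict

def fit(samples):
    """Gaussian Naive Bayes: estimate per-class feature means/variances
    and class priors from (feature_vector, label) pairs."""
    by_class = defaultdict(list)
    for x, y in samples:
        by_class[y].append(x)
    model = {}
    for y, rows in by_class.items():
        n = len(rows)
        means = [sum(col) / n for col in zip(*rows)]
        variances = [sum((v - m) ** 2 for v in col) / n + 1e-9  # smoothed
                     for col, m in zip(zip(*rows), means)]
        model[y] = (means, variances, n / len(samples))
    return model

def predict(model, x):
    """Pick the class maximizing log prior + sum of log Gaussian densities."""
    def log_post(params):
        means, variances, prior = params
        return math.log(prior) + sum(
            -0.5 * math.log(2 * math.pi * var) - (v - m) ** 2 / (2 * var)
            for v, m, var in zip(x, means, variances))
    return max(model, key=lambda y: log_post(model[y]))

# Toy training set: ripe tomatoes skew red, unripe skew green.
train = [((200, 40, 30), "ripe"), ((190, 55, 35), "ripe"),
         ((60, 180, 50), "unripe"), ((70, 170, 45), "unripe")]
model = fit(train)
print(predict(model, (195, 50, 32)))  # → ripe
print(predict(model, (65, 175, 48)))  # → unripe
```

The "naïve" assumption is that the color channels are conditionally independent given the class, which keeps the per-class model to one mean and variance per feature.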