2018 International Conference on Emerging Trends and Innovations in Engineering and Technological Research (ICETIETR)
DOI: 10.1109/icetietr.2018.8529117

Convolutional Neural Network Based Simile Classification System

Cited by 10 publications (3 citation statements)
References 12 publications
“…Naive Bayes [43], [36], [44], [35], [45], [46], [47]; SVM [40], [44], [35], [48], [49], [50], [51], [… 3) Approach: A variety of methods have been applied, drawing on many different techniques, including statistical models, sentiment analysis, pattern recognition, and supervised or unsupervised machine learning [57]. Supervised and semi-supervised approaches are used to build a model that classifies data through a statistical and logical process.…”
Section: Methods (mentioning)
confidence: 99%
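The excerpt above describes the classical supervised route: hand-engineered features fed into Naive Bayes or SVM classifiers. The lines below give a minimal sketch of that pipeline, assuming scikit-learn; the TF-IDF features, toy sentences, and labels are illustrative placeholders, not material from the cited papers.

# Minimal sketch of the classical supervised pipeline described in the excerpt:
# hand-crafted (here TF-IDF) features feeding a Naive Bayes or SVM classifier.
# The toy texts and labels are placeholders, not data from the paper.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

texts = [
    "He fought like a lion in the debate",         # simile
    "The committee approved the budget proposal",  # literal
    "Her smile was as bright as the morning sun",  # simile
    "The train departs at nine every day",         # literal
]
labels = ["simile", "literal", "simile", "literal"]

for clf in (MultinomialNB(), LinearSVC()):
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), clf)
    model.fit(texts, labels)
    print(type(clf).__name__, model.predict(["She sings like a nightingale"]))

The point of the sketch is the division of labour the excerpt highlights: the vectorizer encodes all feature-engineering decisions, and the classifier itself stays generic.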
“…We observe in the literature that classical machine learning models are used as baselines for comparison against neural network models [17], which are more modern and less reliant on feature engineering (as their internal architecture is capable of automatic feature extraction) [16]. However, we should not forget that the classical models can be as effective as the neural ones; the main difference is that they require more manual effort in feature engineering.…”
Section: Related Work (mentioning)
confidence: 99%
“…However, we should not forget that the classical models can be as effective as the neural ones; the main difference is that they require more manual effort in feature engineering. Manjusha and Raseek [16] obtained a winning 79.7% mean F-Measure for classifying articles into satire, humor, and irony with a Convolutional Neural Network (CNN) that competed against SVM, Decision Tree, K-Nearest Neighbors (KNN), and Gaussian Naive Bayes (GNB) baselines. On the other hand, de Morais et al. [20] obtained an 80% F-Measure using classical models alone on a corpus of Brazilian Portuguese news.…”
Section: Related Work (mentioning)
confidence: 99%
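For context, the comparison in this excerpt pits a convolutional text classifier against classical baselines and reports a mean F-Measure. The sketch below, assuming TensorFlow/Keras and scikit-learn, shows the general shape of such a 1-D text CNN and how a macro (mean) F-Measure would be computed; the layer sizes, filter width, and random placeholder data are assumptions, not the configuration used by Manjusha and Raseek [16].

# Hedged sketch of a 1-D text CNN of the kind compared against SVM/KNN/GNB
# baselines in the cited work; all hyperparameters here are illustrative.
import numpy as np
import tensorflow as tf
from sklearn.metrics import f1_score

VOCAB, MAXLEN, CLASSES = 5000, 100, 3   # e.g. satire / humor / irony

# Placeholder data: random token ids stand in for padded word-index sequences.
x = np.random.randint(1, VOCAB, size=(200, MAXLEN))
y = np.random.randint(0, CLASSES, size=(200,))

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB, 64),            # learned word embeddings
    tf.keras.layers.Conv1D(128, 5, activation="relu"),  # n-gram-like filters
    tf.keras.layers.GlobalMaxPooling1D(),             # automatic feature extraction
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(CLASSES, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(x, y, epochs=2, batch_size=32, verbose=0)

pred = model.predict(x, verbose=0).argmax(axis=1)
print("mean (macro) F-Measure:", f1_score(y, pred, average="macro"))

The convolution-plus-pooling stack is what lets the network learn its own features from raw token sequences, which is exactly the contrast with the feature-engineered classical pipeline shown earlier.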