“…The embedding process translates semantic meaning into geometric meaning; Word2Vec [8,23,25], global vectors for word representation (GloVe) [18], n-grams [11,26], word embeddings [12], and bag-of-words (BOW) [7] are the pioneering word-representation approaches. Classification Step: different researchers have applied a range of machine learning and deep learning methods, with machine learning algorithms such as SVM [3,4,7,9,11,14,15,18,21,24,26,37,38,39], decision trees [1,4,13,26], logistic regression [15,20,25], Naïve Bayes [9,40], random forest [25,41,42], and XGBoost [25]. More recently, deep learning methods have been used, such as CNN [12,18,43], a hybrid CNN-LSTM [25], and LRCN [12].…”
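As an illustration of the two steps described in this excerpt, the sketch below is a minimal example, not drawn from any of the cited works: it assumes a small hypothetical set of labeled texts and uses a TF-IDF-weighted bag-of-words representation followed by a linear SVM from scikit-learn, mirroring the representation-then-classification structure of the surveyed approaches.

```python
# Minimal sketch of the two-step pipeline described above:
# (1) map raw text to a geometric (vector) representation,
# (2) train a classical classifier (here, a linear SVM) on those vectors.
# The texts and labels are hypothetical; any labeled corpus could be substituted.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import Pipeline
from sklearn.svm import LinearSVC

texts = [
    "free prize waiting claim now",       # hypothetical positive example
    "meeting rescheduled to monday",      # hypothetical negative example
    "win cash instantly click here",
    "please review the attached report",
]
labels = [1, 0, 1, 0]  # 1 = target class, 0 = other

# Bag-of-words (TF-IDF weighted) features followed by a linear SVM,
# one of the classical combinations reported in the surveyed studies.
model = Pipeline([
    ("bow", TfidfVectorizer(ngram_range=(1, 2))),  # unigrams and bigrams
    ("svm", LinearSVC()),
])
model.fit(texts, labels)

print(model.predict(["claim your free cash prize"]))  # expected: [1]
```

The same two-step structure carries over to the deep learning variants mentioned above, with the embedding layer (e.g., Word2Vec or GloVe vectors) replacing the TF-IDF step and a CNN or CNN-LSTM replacing the SVM.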