2020
DOI: 10.1007/978-3-030-55789-8_17

S2RSCS: An Efficient Scientific Submission Recommendation System for Computer Science

Cited by 7 publications (4 citation statements)
References 10 publications
“…In 2020, Dac et al [26] approached this problem with a new technique, investigating several deep learning methods such as LSTM [11], GRU [5], Conv1D, and an ensemble method. Interestingly, the experimental results [26] outperformed the previous results [12] in terms of Top-1, Top-3, Top-5, and Top-10 accuracy. However, the dataset Wang used contains only 14,012 samples, which is not large enough to yield reliable, high-confidence results.…”
Section: Related Work
confidence: 74%
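As a concrete illustration of the methods this statement names, the sketch below (assuming TensorFlow/Keras) builds small recurrent text classifiers over tokenized abstracts and averages the softmax outputs of several models as a simple ensemble. The vocabulary size, layer widths, and number of candidate journals are illustrative assumptions, not the configuration used in [26].

```python
# Minimal sketch (assumed TensorFlow/Keras; sizes are illustrative, not the
# configuration of [26]) of the recurrent classifiers and ensemble named above.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

VOCAB_SIZE = 20000   # assumed vocabulary size after tokenization
NUM_JOURNALS = 65    # assumed number of candidate journals

def build_recurrent_classifier(rnn_layer):
    """Embedding -> recurrent encoder -> softmax over candidate journals."""
    return tf.keras.Sequential([
        layers.Embedding(VOCAB_SIZE, 128),
        rnn_layer,                                   # e.g. layers.GRU(64) or layers.LSTM(64)
        layers.Dense(NUM_JOURNALS, activation="softmax"),
    ])

gru_model = build_recurrent_classifier(layers.GRU(64))
lstm_model = build_recurrent_classifier(layers.LSTM(64))
for m in (gru_model, lstm_model):
    m.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# A simple ensemble averages the models' predicted probability matrices:
# avg_proba = np.mean([m.predict(x_test) for m in (gru_model, lstm_model)], axis=0)
```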
“…Wang used Chi-square and TF-IDF as feature engineering layers and a logistic regression model as the classifier. Later, on the same data, Son et al [12] outperformed Wang's approach with a simple deep learning model, using an MLP (multi-layer perceptron) as the classifier instead of logistic regression and reaching a Top-3 accuracy of 89.07%. In 2020, Dac et al [26] approached this problem with a new technique, investigating several deep learning methods such as LSTM [11], GRU [5], Conv1D, and an ensemble method.…”
Section: Related Work
confidence: 99%
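The feature pipeline described in this statement can be sketched in a few lines. The snippet below, assuming scikit-learn, combines TF-IDF features with chi-square feature selection and swaps the classifier between logistic regression and an MLP; the feature counts and hidden-layer size are illustrative assumptions, not the cited papers' settings.

```python
# Minimal sketch (assumed scikit-learn; parameter values are illustrative) of the
# TF-IDF + chi-square pipeline described above, with either classifier plugged in.
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier

def make_recommender(use_mlp: bool) -> Pipeline:
    classifier = (MLPClassifier(hidden_layer_sizes=(256,), max_iter=300)
                  if use_mlp
                  else LogisticRegression(max_iter=1000))
    return Pipeline([
        ("tfidf", TfidfVectorizer(max_features=50_000)),  # term-weighted bag of words
        ("chi2", SelectKBest(chi2, k=10_000)),            # keep the most discriminative terms
        ("clf", classifier),
    ])

# baseline  = make_recommender(use_mlp=False).fit(train_texts, train_labels)
# mlp_model = make_recommender(use_mlp=True).fit(train_texts, train_labels)
```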
“…The accuracy of the classifier model at recommending the correct journal within the top 3 suggestions was 0.6137. This algorithm was extended by Huynh et al [14] in their Scientific Submission Recommendation System for Computer Science (S2RSCS), which uses multilayer perceptrons as classifiers. Evaluated on Wang et al’s [13] computer science data set, they achieved an accuracy of 0.8907 when using the title, abstract, and keywords as input to predict the top 3 computer science journals.…”
Section: Discussion
confidence: 99%
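A Top-3 figure like the 0.8907 quoted in this statement can be computed from predicted class probabilities with a short evaluation routine. The sketch below assumes a fitted classifier exposing predict_proba (for example, the pipeline sketched earlier); the `pipeline` object and the paper field names are hypothetical placeholders.

```python
# Minimal sketch of a Top-k evaluation like the Top-3 accuracy quoted above;
# `pipeline`, `papers`, and `labels` are hypothetical placeholders.
import numpy as np

def top_k_accuracy(proba: np.ndarray, y_true: np.ndarray, k: int = 3) -> float:
    """Fraction of papers whose true journal is among the k highest-scoring ones."""
    top_k = np.argsort(proba, axis=1)[:, -k:]          # indices of the k best journals
    return float(np.mean([y in row for y, row in zip(y_true, top_k)]))

# Title, abstract, and keywords concatenated into one text input, as described above:
# texts = [f"{p['title']} {p['abstract']} {p['keywords']}" for p in papers]
# proba = pipeline.predict_proba(texts)
# print("Top-3 accuracy:", top_k_accuracy(proba, np.asarray(labels), k=3))
```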
“…Wang et al’s [13] “publication recommender system” uses term frequency-inverse document frequency (TF-IDF), chi-squared feature selection, and softmax regression classification to suggest journals for computer science publications. This algorithm was subsequently extended by Huynh et al [14], who used multilayer perceptrons as the classifier, and by Nguyen et al [15], who introduced a one-dimensional convolutional neural network. Accuracy measured on the same computer science data set improved steadily as these artificial intelligence methods were incorporated.…”
Section: Introduction
confidence: 99%
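The one-dimensional convolutional network mentioned in this last statement can likewise be sketched briefly. The model below, assuming TensorFlow/Keras with illustrative sizes, is not Nguyen et al.'s exact architecture; it only shows the Conv1D-over-embeddings pattern the statement refers to.

```python
# Minimal sketch (assumed TensorFlow/Keras; sizes are illustrative, not the
# architecture of [15]) of a one-dimensional CNN text classifier.
import tensorflow as tf
from tensorflow.keras import layers

VOCAB_SIZE = 20000   # assumed vocabulary size
NUM_JOURNALS = 65    # assumed number of candidate journals

cnn_model = tf.keras.Sequential([
    layers.Embedding(VOCAB_SIZE, 128),
    layers.Conv1D(128, kernel_size=5, activation="relu"),  # n-gram-like filters over tokens
    layers.GlobalMaxPooling1D(),
    layers.Dense(NUM_JOURNALS, activation="softmax"),
])
cnn_model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
# cnn_model.fit(x_train, y_train, validation_split=0.1, epochs=5)
```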