2022
DOI: 10.26555/ijain.v8i3.691

Aspect-based sentiment analysis for hotel reviews using an improved model of long short-term memory

Abstract: Advances in information technology have given rise to online hotel reservation options. The user review feature is an important factor during the online booking of hotels. Generally, most online hotel booking service providers provide review and rating features for assessing hotels. However, not all service providers provide rating features or recap reviews for every aspect of the hotel services offered. Therefore, we propose a method to summarise reviews based on multiple aspects, including food, room, servic…

Cited by 5 publications (7 citation statements)
References 26 publications

“…In particular, the optimal configuration used 1,200 neurons with the tanh activation function in fully connected layer 1 and 600 neurons with the ReLU activation function in fully connected layer 2. Compared with the standard LSTM model, our proposed model (a modified LSTM with two fully connected layers) performs 10.16% better [43].…”
Section: Methods (mentioning)
confidence: 99%
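
For readers who want to map this description onto code, the sketch below shows one way to realise the cited architecture in Keras: an LSTM followed by the two fully connected layers named in the statement (1,200 neurons with tanh, then 600 with ReLU). The vocabulary size, embedding dimension, LSTM width, and number of output classes are illustrative assumptions, not values taken from the paper.

```python
# A minimal sketch, assuming the Keras Sequential API. Only the two fully
# connected layers (1200/tanh and 600/ReLU) come from the cited statement;
# every other hyperparameter below is a placeholder.
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Embedding, LSTM, Dense

VOCAB_SIZE = 10_000   # placeholder vocabulary size
EMBED_DIM = 100       # placeholder embedding dimension

model = Sequential([
    Embedding(VOCAB_SIZE, EMBED_DIM),
    LSTM(128),                          # placeholder LSTM width
    Dense(1200, activation="tanh"),     # fully connected layer 1 (per the statement)
    Dense(600, activation="relu"),      # fully connected layer 2 (per the statement)
    Dense(3, activation="softmax"),     # placeholder: per-aspect sentiment classes
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
```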
“…The survey conducted by Cendani et al. [7] added an attention mechanism layer after the LSTM layer to improve the model's performance. In contrast, the study conducted by Jayanto et al. [6] did not use an attention mechanism layer. However, Word2Vec as a word embedding technique struggles to capture the context of a word within a sentence.…”
Section: Introduction (mentioning)
confidence: 94%
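
As a rough illustration of "an attention mechanism layer after the LSTM layer", the sketch below applies Keras' built-in dot-product Attention layer as self-attention over the LSTM's per-timestep outputs. This is one common realisation, not necessarily the exact mechanism of Cendani et al. [7]; all sizes are placeholders.

```python
# A hedged sketch: self-attention over LSTM outputs using Keras' built-in
# dot-product Attention layer. Layer sizes and the pooling choice are
# illustrative assumptions.
from tensorflow.keras import layers, Model

VOCAB_SIZE = 10_000   # placeholder
EMBED_DIM = 100       # placeholder
MAX_LEN = 100         # placeholder padded sequence length

inputs = layers.Input(shape=(MAX_LEN,))
x = layers.Embedding(VOCAB_SIZE, EMBED_DIM)(inputs)
h = layers.LSTM(128, return_sequences=True)(x)    # keep every timestep for attention
a = layers.Attention()([h, h])                    # self-attention: query = value = h
pooled = layers.GlobalAveragePooling1D()(a)       # collapse timesteps to one vector
outputs = layers.Dense(3, activation="softmax")(pooled)
model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="categorical_crossentropy")
```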
“…The approach uses the long short-term memory (LSTM) model with word-to-vector (Word2Vec) as the word embedding technique. It obtained an F1-score of 75.28% for the best model, based on a first hidden layer of 1,200 neurons with the tanh activation function and a second hidden layer of 600 neurons with the rectified linear unit (ReLU) activation function [6].…”
Section: Introduction (mentioning)
confidence: 99%
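
The Word2Vec embedding step described here can be sketched with gensim: train vectors on tokenised reviews, then assemble an embedding matrix to initialise the LSTM's embedding layer. The toy corpus, the 100-dimension vectors, and the window size are assumptions for illustration, not details reported in [6].

```python
# A minimal gensim sketch of the Word2Vec embedding step. The corpus,
# vector size, and window are illustrative placeholders.
import numpy as np
from gensim.models import Word2Vec

corpus = [["the", "room", "was", "clean"],
          ["the", "food", "was", "cold"]]   # placeholder tokenised reviews

w2v = Word2Vec(sentences=corpus, vector_size=100, window=5, min_count=1)

vocab = {word: i + 1 for i, word in enumerate(w2v.wv.index_to_key)}  # index 0 = padding
embedding_matrix = np.zeros((len(vocab) + 1, 100))
for word, idx in vocab.items():
    embedding_matrix[idx] = w2v.wv[word]
# embedding_matrix can then seed a Keras Embedding layer (trainable or frozen).
```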
“…Preprocessing is used to clean the dataset and proceeds in stages: case folding, stop-word removal, stemming, tokenization, padding, and vectorization [34]. Next, we extract a dependency graph from the data using the stanza library (http://stanza.run/, accessed on 9 March 2023).…”
Section: Process (mentioning)
confidence: 99%
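
To make the pipeline concrete, the sketch below strings together simplified versions of the listed stages and the stanza dependency parse. The stop-word list, the naive whitespace tokeniser, and the English pipeline are assumptions (stemming, padding, and vectorisation are omitted for brevity, and the cited work's language and tooling details are not given here).

```python
# A hedged sketch of the preprocessing stages plus stanza dependency parsing.
# The stop-word list and whitespace tokeniser are placeholders.
import stanza

STOP_WORDS = {"the", "was", "a", "but"}          # placeholder stop-word list

def preprocess(text: str) -> list[str]:
    tokens = text.lower().split()                # case folding + naive tokenisation
    return [t for t in tokens if t not in STOP_WORDS]

stanza.download("en")                            # assumes an English pipeline
nlp = stanza.Pipeline("en", processors="tokenize,pos,lemma,depparse")
doc = nlp("The room was spacious but the food was cold.")
for sent in doc.sentences:
    for word in sent.words:
        # each (word -> head, relation) triple is an edge of the dependency graph
        head = sent.words[word.head - 1].text if word.head > 0 else "ROOT"
        print(word.text, "->", head, f"({word.deprel})")
```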