2021
DOI: 10.33480/techno.v18i1.2056

Prediction of Hotel Booking Cancellation Using Deep Neural Network and Logistic Regression Algorithm

Abstract: Booking cancellation is a key aspect of hotel revenue management because it affects the room reservation system. Cancellations have a significant effect on revenue and therefore on demand-management decisions in the hotel industry. To reduce the cancellation effect, the hotel applies a cancellation model, developed here as a machine learning-based system, as the key to addressing this problem. This study uses a data collection from the Kaggle website named hotel-booking-d…
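A minimal sketch of the two-model setup the title and abstract describe (logistic regression versus a deep neural network for cancellation prediction). The file name hotel_bookings.csv, the label column is_canceled, and the numeric-feature preprocessing are assumptions made for illustration, since the abstract is truncated here.

```python
# Hypothetical sketch: logistic regression vs. a small DNN on hotel-booking
# cancellation data. File name, label column, and features are assumptions.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from tensorflow import keras

df = pd.read_csv("hotel_bookings.csv")        # assumed Kaggle file name
y = df["is_canceled"]                         # assumed binary label column
X = df.select_dtypes("number").drop(columns=["is_canceled"]).fillna(0)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)
scaler = StandardScaler()
X_train = scaler.fit_transform(X_train)
X_test = scaler.transform(X_test)

# Baseline: logistic regression.
logreg = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("LogReg accuracy:", accuracy_score(y_test, logreg.predict(X_test)))

# Small DNN with a sigmoid output for the binary cancellation target.
dnn = keras.Sequential([
    keras.layers.Input(shape=(X_train.shape[1],)),
    keras.layers.Dense(64, activation="relu"),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
dnn.compile(optimizer="adam", loss="binary_crossentropy",
            metrics=["accuracy"])
dnn.fit(X_train, y_train, epochs=10, batch_size=256, verbose=0)
print("DNN accuracy:", dnn.evaluate(X_test, y_test, verbose=0)[1])
```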

Cited by 5 publications (1 citation statement) · References 3 publications
“…An optimizer is an algorithm used for gradient optimization of neural-network parameters such as weights and learning rates in order to reduce the loss [30]. This study uses five optimizers from the Keras model: stochastic gradient descent (SGD), which optimizes a function by following a noisy gradient with a decreasing step size [31]; adaptive moment estimation (Adam), an extension of SGD [32]; the adaptive gradient algorithm (Adagrad), a modified stochastic gradient descent with a per-parameter learning rate [30]; Adadelta, a stochastic gradient descent method based on a per-dimension adaptive learning rate; and root mean square propagation (RMSProp), a method in which the learning rate is adjusted for each parameter.…”
Section: Optimizer
confidence: 99%
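A minimal sketch of how those five Keras optimizers are instantiated; the learning rates shown are illustrative defaults, not values reported by the study.

```python
# The five optimizers the statement lists, as provided by Keras.
# Learning rates here are illustrative, not the study's settings.
from tensorflow import keras

optimizers = {
    "SGD":      keras.optimizers.SGD(learning_rate=0.01),       # noisy gradient steps
    "Adam":     keras.optimizers.Adam(learning_rate=0.001),     # adaptive moment estimation, extends SGD
    "Adagrad":  keras.optimizers.Adagrad(learning_rate=0.01),   # per-parameter learning rate
    "Adadelta": keras.optimizers.Adadelta(learning_rate=1.0),   # per-dimension adaptive learning rate
    "RMSprop":  keras.optimizers.RMSprop(learning_rate=0.001),  # rate scaled by a running RMS of gradients
}

# Each optimizer plugs into the same compile step, so comparing them
# reduces to swapping this single argument across otherwise identical runs.
for name, opt in optimizers.items():
    model = keras.Sequential([keras.layers.Dense(1, activation="sigmoid")])
    model.compile(optimizer=opt, loss="binary_crossentropy",
                  metrics=["accuracy"])
```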