2008
DOI: 10.1016/j.eswa.2006.12.016
Back-propagation neural network based importance–performance analysis for determining critical service attributes

Cited by 130 publications (87 citation statements)
References 64 publications
“…is an essential two-part question. It can be answered with respect to the economic value of assets [62]. To find the main factors that appeal to tourists is relevant to economic value and a reflection of the cultural value of the sites: Which factors in heritage buildings would be the main factors appealing to tourists?…”
mentioning
confidence: 99%
“…Then the performances of several network configurations (with 2, 5, 7, 10, and 12-13 hidden neurons) are measured by three indicators: the mean absolute error (MAE), the RMSE and the goodness-of-fit (R²). Note that MAE and RMSE values approaching 0 indicate that the BPNN model has precise prediction ability, whereas an R² close to 1 indicates that the BPNN model has excellent goodness-of-fit [9]. The best-performing network is the one with two hidden neurons (training: MAE = 0.588, RMSE = 0.770, R² = 0.442; testing: MAE = 0.617, RMSE = 0.802), since its training RMSE is the lowest and its R² the highest.…”
Section: Methodology
mentioning
confidence: 99%
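The metric sweep described in this excerpt can be illustrated with a short sketch. This is not the cited authors' code: the data, the 20-input layout, and the use of scikit-learn's MLPRegressor are assumptions for illustration only.

```python
# Hedged sketch: sweep hidden-layer sizes for a back-propagation network and
# report MAE, RMSE and R^2 on a train/test split, mirroring the excerpt above.
# Data and library choice are hypothetical, not taken from the cited study.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 20))                               # hypothetical attribute ratings
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=200)   # hypothetical satisfaction score

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

for hidden in (2, 5, 7, 10, 12, 13):
    net = MLPRegressor(hidden_layer_sizes=(hidden,), activation="tanh",
                       max_iter=5000, random_state=0)
    net.fit(X_tr, y_tr)
    pred = net.predict(X_te)
    mae = mean_absolute_error(y_te, pred)
    rmse = mean_squared_error(y_te, pred) ** 0.5             # RMSE = sqrt(MSE)
    r2 = r2_score(y_te, pred)
    print(f"hidden={hidden:2d}  MAE={mae:.3f}  RMSE={rmse:.3f}  R2={r2:.3f}")
```

Under the selection rule in the excerpt (lowest training RMSE, highest R²), the configuration with the best printed scores would be retained.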
“…Section 4 concludes the paper.

Study | Topology | Activation (hidden / output)
[9] | 20-31-1 | Hyperbolic tangent / Hyperbolic tangent
Deng and Pei (2009) [11] | 20-26-1 | Hyperbolic tangent / Hyperbolic tangent
Chen, Lin and Lin (2010) [12] | 7-15-1 | Sigmoid / Sigmoid
Mikulić and Prebežac (2012) [13] | 8-10-1 | Hyperbolic tangent / Hyperbolic tangent
Hosseini and Bideh (2013) [14] | 7-7-1 | Hyperbolic tangent / Hyperbolic tangent
Krešić, Mikulić, and Kožić (2013) [10] | 11-11-1 | Hyperbolic tangent / Identity…”
Section: Introduction
mentioning
confidence: 99%
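The topologies listed above read as input-hidden-output layer sizes. Below is a minimal NumPy sketch of the first row's 20-31-1 network with hyperbolic-tangent hidden and output activations; the weights are random placeholders, not trained values, and the layout is an illustration rather than the cited implementation.

```python
# Hedged sketch: forward pass of a "20-31-1" network with tanh hidden and
# tanh output activations, matching the notation in the table above.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 20, 31, 1                  # "20-31-1" topology
W1 = rng.normal(scale=0.1, size=(n_in, n_hidden))  # input -> hidden weights (placeholder)
b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.1, size=(n_hidden, n_out)) # hidden -> output weights (placeholder)
b2 = np.zeros(n_out)

def forward(x):
    """One forward pass: tanh hidden layer followed by a tanh output layer."""
    h = np.tanh(x @ W1 + b1)
    return np.tanh(h @ W2 + b2)

print(forward(rng.normal(size=(1, n_in))).shape)   # -> (1, 1)
```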
“…A tenfold cross-validation process was used for training and testing. Several studies have been performed to determine the relative importance of each input to the output [50][51][52]. The accuracy of the ANN is compared with Multiple Linear Regression (MLR) and Nonlinear Least Squares Fitting (NLSF) methods.…”
Section: Artificial Neural Network
mentioning
confidence: 99%
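The tenfold cross-validation and ANN-versus-MLR comparison mentioned in this excerpt can be sketched as follows. The data, the 7-input layout, and the scikit-learn estimators are assumptions for illustration; NLSF is omitted because its functional form is not given in the excerpt.

```python
# Hedged sketch: 10-fold cross-validation comparing an ANN (MLP) with
# multiple linear regression, as described in the excerpt above.
import numpy as np
from sklearn.model_selection import cross_val_score, KFold
from sklearn.neural_network import MLPRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 7))                      # hypothetical inputs
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + rng.normal(scale=0.2, size=300)

cv = KFold(n_splits=10, shuffle=True, random_state=0)
models = [("ANN (MLP)", MLPRegressor(hidden_layer_sizes=(7,), activation="tanh",
                                     max_iter=5000, random_state=0)),
          ("MLR", LinearRegression())]
for name, model in models:
    scores = cross_val_score(model, X, y, cv=cv, scoring="r2")
    print(f"{name}: mean R2 over 10 folds = {scores.mean():.3f}")
```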