2013
DOI: 10.3390/ijgi2030577

Evaluation of Model Validation Techniques in Land Cover Dynamics

Abstract: This paper applies different methods of map comparison to quantify the characteristics of three different land change models. The land change models used for simulation are termed the “Stochastic Markov (St_Markov)”, “Cellular Automata Markov (CA_Markov)” and “Multi Layer Perceptron Markov (MLP_Markov)” models. Various model validation techniques such as the per-category method, kappa statistics, components of agreement and disagreement, three-map comparison and fuzzy methods have then been applied. A comparative a…

Cited by 78 publications (48 citation statements)
References: 45 publications
“…In this study, owing to the CA-Markov model being a spatiotemporal model, kappa value alone cannot accurately reflect spatial changes [59]. The Map Comparison Kit 3 is a good alternative to present the different errors not only in quantity, but also in location [60]. By considering this reason, the Map Comparison Kit 3 was selected to validate our simulation results [61].…”
Section: Transition Probability
confidence: 99%
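The point about errors "not only in quantity, but also in location" corresponds to decomposing the total disagreement between a simulated and a reference map into a quantity component and an allocation component, as in the components-of-disagreement framework of Pontius and Millones. The Python sketch below illustrates that decomposition under stated assumptions (rows of the cross-tabulation hold the simulated map, columns the reference map; the function and variable names are illustrative, not taken from the paper or from Map Comparison Kit 3):

```python
import numpy as np

def disagreement_components(crosstab):
    """Quantity and allocation disagreement from a cross-tabulation of
    simulated (rows) vs. reference (columns) classes.
    Illustrative sketch; the row/column convention is an assumption."""
    p = crosstab / crosstab.sum()        # cell proportions
    diag = np.diag(p)                    # agreement on the diagonal
    sim_totals = p.sum(axis=1)           # class proportions in the simulated map
    ref_totals = p.sum(axis=0)           # class proportions in the reference map
    quantity = 0.5 * np.abs(sim_totals - ref_totals).sum()
    allocation = np.minimum(sim_totals - diag, ref_totals - diag).sum()
    return quantity, allocation

# Example with two small 3-class maps (class codes 0..2)
reference = np.array([[0, 1, 1], [2, 2, 1], [0, 0, 2]])
simulated = np.array([[0, 1, 2], [2, 2, 1], [0, 1, 2]])
crosstab = np.zeros((3, 3))
np.add.at(crosstab, (simulated.ravel(), reference.ravel()), 1)
q, a = disagreement_components(crosstab)
# q + a equals 1 minus the overall proportion of agreeing cells
```

Quantity disagreement captures mismatch in how much of each class the two maps contain; allocation disagreement captures cells that could still be matched by relocating classes without changing their totals.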
“…Cohen's kappa measures the agreement between two raters. This approach is useful for identifying the degree to which the cells of two maps are identical [30]. K is measured as follows:…”
Section: Model Validation
confidence: 99%
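The formula is truncated in the excerpt above; the standard definition of Cohen's kappa is K = (p_o − p_e)/(1 − p_e), where p_o is the observed proportion of cells on which the two maps agree and p_e is the agreement expected by chance from the class marginals. A minimal Python sketch under that definition follows; the argument names and the 0..n−1 class coding are assumptions for illustration:

```python
import numpy as np

def cohens_kappa(map_a, map_b, n_classes):
    """Cohen's kappa between two equally shaped categorical maps.
    Minimal sketch; names and the 0..n_classes-1 coding are assumptions."""
    a = np.asarray(map_a).ravel()
    b = np.asarray(map_b).ravel()
    cm = np.zeros((n_classes, n_classes))
    np.add.at(cm, (a, b), 1)                        # confusion matrix
    n = cm.sum()
    p_o = np.trace(cm) / n                          # observed agreement
    p_e = cm.sum(axis=1) @ cm.sum(axis=0) / n**2    # chance agreement
    return (p_o - p_e) / (1.0 - p_e)
```

Kappa equals 1 for identical maps and is close to 0 when the agreement does not exceed what the class frequencies alone would produce, which is why a high kappa alone says nothing about where the agreeing cells are located.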
“…The overall accuracies of the classified images (1989, 1999, and 2009) were found to be 86.48%, 90.69%, and 94.83%, respectively, with Kappa coefficients of 0.86, 0.91, and 0.95 (Table 3). Note that the Kappa coefficient measures the proportional (or percentage) improvement of the classifier over a purely random assignment to classes [56,57]. The user's accuracy, on the other hand, measures the proportion of each land cover class that is correct, whereas the producer's accuracy measures the proportion of the land base that is correctly classified.…”
Section: Derivation of Land Cover Maps
confidence: 99%
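The overall, user's, and producer's accuracies quoted above all derive from the same confusion matrix. The sketch below shows the standard definitions, assuming rows hold the classified map and columns the reference data (the opposite convention simply swaps user's and producer's accuracy); the names are illustrative, not from the paper:

```python
import numpy as np

def accuracy_summary(cm):
    """Overall, user's and producer's accuracy from a confusion matrix
    (rows = classified map, columns = reference data); sketch only,
    the row/column convention is an assumption."""
    cm = cm.astype(float)
    overall = np.trace(cm) / cm.sum()         # fraction of correctly labelled cells
    users = np.diag(cm) / cm.sum(axis=1)      # correct / all cells assigned to a class
    producers = np.diag(cm) / cm.sum(axis=0)  # correct / all reference cells of a class
    return overall, users, producers
```

User's accuracy is the complement of commission error and producer's accuracy the complement of omission error, which matches the distinction drawn in the excerpt.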