2014 International Conference on Signal Propagation and Computer Technology (ICSPCT 2014)
DOI: 10.1109/icspct.2014.6884944

Fusion classification of multispectral and panchromatic image using improved decision tree algorithm

Abstract: In this paper, efforts are made to detect areas such as vegetation, water, soil, and built-up land in satellite images. The dataset is drawn from the Landsat 7 ETM+ satellite, which provides a low-resolution multispectral image and a high-resolution panchromatic image. Detecting the features of an urban area requires both spatial and spectral information, so the two images are first fused using different methods. The resulting fused image is then used for classification in various…
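For context, a minimal sketch of the pipeline the abstract describes: fuse the low-resolution multispectral (MS) image with the high-resolution panchromatic (PAN) band, then classify the fused pixels with a decision tree. The Brovey-style fusion rule, band shapes, and training labels below are illustrative assumptions, not the authors' exact method.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def brovey_fuse(ms_up, pan):
    """Brovey-style fusion: rescale each upsampled MS band by PAN / intensity.

    ms_up : (H, W, B) MS image already resampled to the PAN grid
    pan   : (H, W) panchromatic band
    """
    intensity = ms_up.mean(axis=2) + 1e-6          # avoid divide-by-zero
    return ms_up * (pan / intensity)[..., None]

# Toy arrays standing in for Landsat 7 ETM+ inputs (assumed shapes).
ms_up = np.random.rand(64, 64, 4)                  # 4 MS bands, upsampled
pan = np.random.rand(64, 64)                       # PAN band
fused = brovey_fuse(ms_up, pan)

# Per-pixel classification; labels here are synthetic placeholders for
# classes such as vegetation / water / soil / built-up area.
X = fused.reshape(-1, fused.shape[2])
y = np.random.randint(0, 4, size=X.shape[0])       # hypothetical ground truth
clf = DecisionTreeClassifier(max_depth=8).fit(X, y)
class_map = clf.predict(X).reshape(64, 64)
```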

Cited by 11 publications (5 citation statements). References: 10 publications.
“…Probabilities, possibilities and evidence theories were checked in [33] to achieve a robust multispectral fusion scheme. Recent studies [34-37] consider that proper decision fusion would be accomplished with the use of SVM, Bayesian networks and naïve Bayesian classifiers, weights of evidence models and decision tree algorithms, respectively. Furthermore, in [38], scene contextual information was exploited for fusion; in [39], SVM and RF classifiers were used with three adaptively weighted decision procedures; in [40], four non-parametric classifiers, namely decision tree (DT), RF, SVM and multilayer perceptron (MLP), were utilized; and in [41], a fuzzy classification with a weighted sum of the membership of imaged objects was implemented in the final classification decision.…”
Section: Multispectral Data (mentioning)
Confidence: 99%
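As a rough illustration of the weighted decision-fusion idea recurring across [34-41], the sketch below combines per-class probabilities from two classifiers using weights derived from validation accuracy. The weighting rule and toy data are assumptions for illustration, not a specific method from the cited works.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for per-pixel spectral features and class labels.
X, y = make_classification(n_samples=400, n_features=6, n_classes=3,
                           n_informative=4, random_state=0)
X_tr, X_va, y_tr, y_va = train_test_split(X, y, random_state=0)

models = [RandomForestClassifier(random_state=0),
          SVC(probability=True, random_state=0)]
weights = []
for m in models:
    m.fit(X_tr, y_tr)
    weights.append(m.score(X_va, y_va))   # accuracy as an (assumed) weight
weights = np.array(weights) / sum(weights)

# Weighted sum of class-probability "memberships", then argmax decision.
proba = sum(w * m.predict_proba(X_va) for w, m in zip(weights, models))
fused_labels = proba.argmax(axis=1)
```

In practice the weights would come from a held-out set separate from the data being fused; reusing the validation set here keeps the sketch short.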
“…Consequently, the memberships for MODIS data are determined using Formula (1), the normalized Euclidean distance (NED). Equation (37) outlines the computation of NED.…”
Section: Classification of Image Objects (mentioning)
Confidence: 99%
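One plausible reading of a distance-based membership is sketched below. The exact form of the citing paper's Equation (37) is not reproduced here, so the per-band standardisation and the 1 / (1 + NED) membership mapping are assumptions.

```python
import numpy as np

def ned(x, class_mean, class_std):
    """Euclidean distance after per-band standardisation (assumed NED form)."""
    z = (x - class_mean) / (class_std + 1e-9)
    return np.sqrt(np.mean(z ** 2))

def membership(x, class_mean, class_std):
    """Map distance to a (0, 1] membership: closer spectra score higher."""
    return 1.0 / (1.0 + ned(x, class_mean, class_std))

# Example: membership of one MODIS-like pixel spectrum to a class prototype.
pixel = np.array([0.21, 0.35, 0.18])
mu = np.array([0.20, 0.33, 0.17])       # hypothetical class mean
sigma = np.array([0.05, 0.04, 0.03])    # hypothetical class std
print(membership(pixel, mu, sigma))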
“…Urban characteristics are more distinguishable in the Urban Index (UI) than in the Normalized Difference Built-up Index (NDBI). The most accurate identifications occur when band 7 is used rather than band 5 (Bouhennache et al., 2015; Pratibha et al., 2014). As a result, these UI values were employed instead of the Normalized Difference Built-up Index (NDBI) data.…”
Section: UI = (SWIR - NIR) / (SWIR + NIR) (Eq. 2) (mentioning)
Confidence: 99%
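A short sketch of computing the UI from Equation (2) follows. The band assignments (ETM+ band 7 as the SWIR term, band 4 as NIR) and the reflectance arrays are illustrative assumptions.

```python
import numpy as np

def urban_index(swir, nir):
    """UI = (SWIR - NIR) / (SWIR + NIR), computed per pixel."""
    return (swir - nir) / (swir + nir + 1e-9)

swir2 = np.random.rand(64, 64)   # e.g. ETM+ band 7 reflectance (assumed)
nir = np.random.rand(64, 64)     # e.g. ETM+ band 4 reflectance (assumed)
ui = urban_index(swir2, nir)     # higher values suggest built-up surfaces
```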
“…The first approach involves pan-sharpening the MS data to generate a fused image, followed by classification. Several studies have employed this method, including those by Shackelford and Davis (2003), Castillejo-González et al. (2009), Amro et al. (2011), Shingare et al. (2014), Huang et al. (2015, 2021), Masi et al. (2016), and Zhong et al. (2016). The second approach entails extracting distinct features from both the panchromatic (PAN) and MS data, which are subsequently fused for classification.…”
Section: Introduction (mentioning)
Confidence: 99%
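The second, feature-level approach can be sketched as stacking the MS spectral bands with a texture feature extracted from PAN, one fused feature vector per pixel. The local-standard-deviation texture measure and the window size below are illustrative choices, not taken from the cited studies.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_std(img, size=5):
    """Per-pixel standard deviation in a size x size window (PAN texture)."""
    mean = uniform_filter(img, size)
    mean_sq = uniform_filter(img ** 2, size)
    return np.sqrt(np.maximum(mean_sq - mean ** 2, 0.0))

pan = np.random.rand(64, 64)            # high-resolution PAN band (assumed)
ms_up = np.random.rand(64, 64, 4)       # MS bands resampled to the PAN grid

# Feature-level fusion: spectral bands + PAN texture, stacked per pixel.
features = np.dstack([ms_up, local_std(pan)[..., None]])
X = features.reshape(-1, features.shape[2])   # ready for any classifier
```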