1976
DOI: 10.1109/tsmc.1976.5408777
A Comparative Study of Texture Measures for Terrain Classification

Cited by 1,301 publications (534 citation statements)
References 2 publications
“…8, minimum and total variations are not sufficient to discriminate between all textures [42][43][44] (as they are very similar to the contrast [45] and the gray level difference statistics [46]), but it is recognized that in general, different operators correspond to different images. In this work, specific knowledge of the acquisition process is not applied and we try to demonstrate the validity of our neural implementation to a greater extent than the operator's one.…”
Section: Comparisons With Other Segmentation Methods (mentioning)
confidence: 99%
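The excerpt contrasts variation-based operators with the contrast and gray level difference statistics studied in the cited paper. As a rough illustration only (not code from either paper), the sketch below computes a GLCM contrast and simple gray level difference statistics for a horizontal one-pixel displacement; the quantization depth and displacement are arbitrary choices.

```python
# Illustrative sketch (assumed parameters): GLCM contrast and gray level
# difference statistics for a horizontal one-pixel displacement.
import numpy as np

def glcm_contrast(patch: np.ndarray, levels: int = 8) -> float:
    """Contrast of the normalized co-occurrence matrix: sum_{i,j} (i - j)^2 p(i, j)."""
    q = np.floor(patch.astype(float) / 256.0 * levels).astype(int)  # quantize to `levels` gray levels
    p = np.zeros((levels, levels))
    np.add.at(p, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)  # count horizontally adjacent gray level pairs
    p /= p.sum()
    i, j = np.indices(p.shape)
    return float(np.sum((i - j) ** 2 * p))

def gray_level_difference_stats(patch: np.ndarray, levels: int = 8) -> dict:
    """Mean and entropy of the absolute gray level difference histogram."""
    q = np.floor(patch.astype(float) / 256.0 * levels).astype(int)
    d = np.abs(q[:, :-1] - q[:, 1:]).ravel()          # absolute differences of horizontal neighbors
    h = np.bincount(d, minlength=levels).astype(float)
    h /= h.sum()
    nz = h[h > 0]
    return {"mean": float((np.arange(levels) * h).sum()),
            "entropy": float(-(nz * np.log2(nz)).sum())}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    patch = rng.integers(0, 256, size=(32, 32))       # toy grayscale patch
    print(glcm_contrast(patch), gray_level_difference_stats(patch))
```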
“…In this paper, a texture information comparison function is proposed by comparing the correlation texture information and autocorrelation texture information between different neighborhoods. The correlation texture information is defined as [28]:…”
Section: Texture Information Comparison Function (mentioning)
confidence: 99%
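The defining equation from [28] is not reproduced in the excerpt, so the following is only an assumed sketch of the general idea: a standard GLCM correlation and a normalized spatial autocorrelation computed per neighborhood, then compared between two neighborhoods by absolute difference. The function names and the comparison form are hypothetical, not the cited paper's definition.

```python
# Assumed illustration: per-neighborhood correlation-style texture features
# and a simple comparison between two neighborhoods.
import numpy as np

def glcm_correlation(patch: np.ndarray, levels: int = 8) -> float:
    """Correlation of the normalized horizontal co-occurrence matrix."""
    q = np.floor(patch.astype(float) / 256.0 * levels).astype(int)
    p = np.zeros((levels, levels))
    np.add.at(p, (q[:, :-1].ravel(), q[:, 1:].ravel()), 1)
    p /= p.sum()
    i, j = np.indices(p.shape)
    mi, mj = (i * p).sum(), (j * p).sum()
    si = np.sqrt(((i - mi) ** 2 * p).sum())
    sj = np.sqrt(((j - mj) ** 2 * p).sum())
    return float(((i - mi) * (j - mj) * p).sum() / (si * sj + 1e-12))

def autocorrelation(patch: np.ndarray, shift: int = 1) -> float:
    """Normalized autocorrelation at a one-pixel horizontal shift."""
    x = patch.astype(float) - patch.mean()
    return float((x[:, :-shift] * x[:, shift:]).sum() / ((x ** 2).sum() + 1e-12))

def texture_comparison(nbhd_a: np.ndarray, nbhd_b: np.ndarray) -> float:
    """Compare two neighborhoods by their correlation and autocorrelation features (hypothetical form)."""
    return abs(glcm_correlation(nbhd_a) - glcm_correlation(nbhd_b)) \
         + abs(autocorrelation(nbhd_a) - autocorrelation(nbhd_b))
```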
“…These include Bayes classifiers assuming multivariate Gaussian distributions for the features [3], [4], [5], [6]; Fisher transformation [7], [8]; nonparametric nearest neighbor classification [9], [10], [11], [12]; classification trees [8]; learning vector quantization [13], [14]; feed-forward neural networks [15]; and recently support vector machines [16], [17] and multiple histograms combined with a self-organized map [18]. In some earlier cases, the statistical modelling after the feature extraction is just thresholding [19], [20], [21], [22]; or simple extremum picking [23], [24], [25].…”
Section: Literature (mentioning)
confidence: 99%
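Of the classifiers listed in the excerpt, nonparametric nearest neighbor classification is the simplest to sketch. The toy example below assumes texture feature vectors have already been extracted (e.g. contrast and difference statistics as above); the feature values and class labels are illustrative only.

```python
# Minimal sketch: 1-nearest-neighbor classification of texture feature vectors.
import numpy as np

def nearest_neighbor_classify(train_feats: np.ndarray,
                              train_labels: np.ndarray,
                              query_feat: np.ndarray) -> int:
    """Return the label of the training feature vector closest to the query (Euclidean distance)."""
    dists = np.linalg.norm(train_feats - query_feat, axis=1)
    return int(train_labels[np.argmin(dists)])

# Toy 2-D feature vectors, e.g. [contrast, difference entropy]; labels are hypothetical.
train_feats = np.array([[0.2, 1.1], [0.3, 1.0], [2.5, 2.9], [2.7, 3.1]])
train_labels = np.array([0, 0, 1, 1])   # 0 = smooth terrain class, 1 = rough terrain class
print(nearest_neighbor_classify(train_feats, train_labels, np.array([2.4, 3.0])))  # -> 1
```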