2022
DOI: 10.3390/e24040509

Entropy Estimators in SAR Image Classification

Abstract: Remotely sensed data are essential for understanding environmental dynamics, for their forecasting, and for early detection of disasters. Microwave remote sensing sensors complement the information provided by observations in the optical spectrum, with the advantage of being less sensitive to adverse atmospheric conditions and of carrying their own source of illumination. On the one hand, new generations and constellations of Synthetic Aperture Radar (SAR) sensors provide images with high spatial and temporal…

Cited by 8 publications (4 citation statements)
References 48 publications
“…The first is suitable for fully developed speckle and is a limiting case of the second model. This is interesting due to its versatility in accurately representing regions with different roughness properties [24]. We denote Z ∼ Γ_SAR(L, µ) and Z ∼ G⁰_I(α, γ, L) to indicate that Z follows the distributions characterized by the respective probability density functions (pdfs):…”
Section: Statistical Modeling of Intensity SAR Data
confidence: 99%
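The quoted passage stops short of the pdfs themselves. Under the parameterizations usual in the SAR statistics literature (an assumption here, since the excerpt does not reproduce them), the Γ_SAR and G⁰_I densities can be sketched as:

```python
import numpy as np
from scipy.special import gammaln


def gamma_sar_pdf(z, L, mu):
    """Density of Z ~ Gamma_SAR(L, mu): a Gamma law with shape L and mean mu,
    the classical model for fully developed speckle in intensity SAR data."""
    z = np.asarray(z, dtype=float)
    log_f = (L * np.log(L) - gammaln(L) - L * np.log(mu)
             + (L - 1) * np.log(z) - L * z / mu)
    return np.exp(log_f)


def g0i_pdf(z, alpha, gamma, L):
    """Density of Z ~ G0_I(alpha, gamma, L), alpha < 0, gamma > 0: the more
    flexible model, able to describe regions of varying roughness."""
    z = np.asarray(z, dtype=float)
    log_f = (L * np.log(L) + gammaln(L - alpha) - alpha * np.log(gamma)
             - gammaln(L) - gammaln(-alpha)
             + (L - 1) * np.log(z) - (L - alpha) * np.log(gamma + L * z))
    return np.exp(log_f)
```

Both functions work in log space to avoid overflow for large L; each density integrates to one over the positive half-line.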
“…Several authors have explored adaptations to Vasicek's estimator. We consider three estimators known for their superior performance [24]:…”
Section: Estimation of the Shannon Entropy
confidence: 99%
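The excerpt does not reproduce the three adapted estimators. As a hedged baseline, Vasicek's original spacing estimator of Shannon entropy, which those adaptations modify, can be sketched as:

```python
import numpy as np


def vasicek_entropy(sample, m=None):
    """Vasicek's (1976) spacing estimator of Shannon entropy:
    H_hat = (1/n) * sum_i log( n/(2m) * (X_(i+m) - X_(i-m)) ),
    with order statistics clipped to X_(1) and X_(n) at the edges."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(round(np.sqrt(n))))  # a common window choice
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]   # X_(i+m), clipped at X_(n)
    lower = x[np.maximum(idx - m, 0)]       # X_(i-m), clipped at X_(1)
    return np.mean(np.log(n / (2.0 * m) * (upper - lower)))
```

For a standard normal sample the estimate should approach the true differential entropy, 0.5·log(2πe) ≈ 1.4189 nats; the clipping at the edges is the source of the bias that the adapted estimators address.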
“…The entropy is ultimately defined based on this probability distribution. Classical one-dimensional image entropy includes fuzzy entropy [4,5], Kapur entropy [6], cross entropy [7,8], and Shannon entropy [9]. In addition to being applied to image segmentation or classification, image entropy has also been applied to image filtering and denoising [10,11].…”
Section: Image Entropy
confidence: 99%
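As a minimal illustration of one-dimensional image entropy (a sketch, not tied to any particular estimator in the paper), the Shannon entropy of an 8-bit image's gray-level histogram can be computed as:

```python
import numpy as np


def image_shannon_entropy(img, bins=256):
    """Shannon entropy (in bits) of an image's gray-level histogram:
    H = -sum_k p_k log2 p_k over the nonzero histogram probabilities p_k."""
    hist, _ = np.histogram(img.ravel(), bins=bins, range=(0, bins))
    p = hist / hist.sum()
    p = p[p > 0]                      # 0 * log 0 is taken as 0
    return -np.sum(p * np.log2(p))
```

A constant image yields 0 bits, while an image using all 256 gray levels equally often attains the maximum of 8 bits.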
“…The entropy associated with a random variable is a measure of its uncertainty or diversity, taking large values for a highly unpredictable random variable (i.e., all outcomes equally probable) and low values for a highly predictable one (i.e., one or few outcomes much more probable than the others). As such, the concept has found multiple applications in a variety of fields including but not limited to nonlinear dynamics, statistical physics, information theory, biology, neuroscience, cryptography, and linguistics [1–13].…”
Section: Introduction
confidence: 99%
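The uncertainty interpretation in the quote can be checked numerically: Shannon entropy is maximal for a uniform distribution and small for a nearly deterministic one (a self-contained sketch):

```python
import numpy as np


def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (in nats), with 0 log 0 := 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))


uniform = np.full(8, 1 / 8)                  # all outcomes equally probable
peaked = np.array([0.99] + [0.01 / 7] * 7)   # one outcome dominates

# The uniform law attains the maximum log(8) ≈ 2.079 nats;
# the peaked law falls far below it.
```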