2018
DOI: 10.1007/s10044-018-0698-z
Wavelet-like selective representations of multidirectional structures: a mammography case

Abstract: The subject of this paper is the selective representation of informative texture directionality in the sparse domains of four verified multiscale transforms: contourlets, curvelets, tensor wavelets, and complex wavelets. The directionality of linear or piecewise-linear structures is a fundamental property in the recognition of anatomical structures, e.g. separating brain regions or characterizing the direction of small coronary arteries. Another important example is spicule extraction in mammograms, which is based on directional analysis…

Cited by 4 publications (2 citation statements). References 12 publications.
“…The recognition and classification of texture images, either in isolated conditions or "in the wild" is one of the most important tasks in computer vision, with numerous applications in material sciences [51], medicine [38], remote sensing [65], agriculture [56], etc.…”
Section: Introduction
confidence: 99%
“…The recognition and classification of texture images, either in isolated conditions or "in the wild" is one of the most important tasks in computer vision, with numerous applications in material sciences (Naik and Khan, 2019), medicine (Jasionowska and Przelaskowski, 2019), remote sensing (Tu et al, 2019), agriculture (Safdar et al, 2019), etc.…”
Section: Introduction
confidence: 99%