2017 10th Biomedical Engineering International Conference (BMEiCON)
DOI: 10.1109/bmeicon.2017.8229106
Classification of thyroid ultrasound images based on shape features analysis

Cited by 8 publications (4 citation statements). References 13 publications.
“…Our irregularity method and the methods developed in 5,7,12,14 both measure global irregularity of the nodule, but our method uses convexity and ellipticity variance, providing a more robust and accurate assessment of nodule irregularity without being excessive. Furthermore, our method has a new feature extraction step that incorporates measures of local irregularity of the margin, such as lobulation and angulation.…”
Section: Methods (citation type: mentioning; confidence: 99%)
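The convexity and ellipticity-variance features named in this statement are not defined on this page. The sketch below shows one plausible reading, assuming a binary nodule mask: convexity as the contour-to-hull area ratio, and elliptic variance as the spread of Mahalanobis distances of boundary points from the centroid. The file name and function names are hypothetical, not the cited authors' implementation.

```python
# Hypothetical sketch of global margin-irregularity features
# (convexity and elliptic variance) for a segmented nodule.
import numpy as np
import cv2

def convexity(contour):
    """Ratio of contour area to convex-hull area; close to 1.0 for a convex margin."""
    hull_area = cv2.contourArea(cv2.convexHull(contour))
    return cv2.contourArea(contour) / hull_area if hull_area > 0 else 0.0

def elliptic_variance(contour):
    """Variance of normalized Mahalanobis distances of boundary points from the
    centroid; a smooth, ellipse-like margin yields a low value, an irregular one a high value."""
    pts = contour.reshape(-1, 2).astype(np.float64)
    mu = pts.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(pts.T))
    diff = pts - mu
    d = np.sqrt(np.einsum("ij,jk,ik->i", diff, cov_inv, diff))  # per-point quadratic form
    d /= d.mean()  # scale invariance
    return float(np.var(d))

# Usage: extract the nodule boundary from an (assumed binary) segmentation mask.
mask = cv2.imread("nodule_mask.png", cv2.IMREAD_GRAYSCALE)
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
nodule = max(contours, key=cv2.contourArea)
print(convexity(nodule), elliptic_variance(nodule))
```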
“…Several studies to automatically quantify features based on the standardized TIRADS categories for classifying thyroid nodules have been reported, 12,13 but the work on extracting correct TIRADS features from ultrasound images for annotation purposes remains limited. Zulfanahri et al 5 analyzed and classified the margin irregularity of thyroid nodule into regular or irregular class using rectangularity, convexity, and tortuosity features with an SVM classifier. The study reported an accuracy of 91.52% (91.80% sensitivity and 91.35% specificity) over a set of 165 images.…”
Section: Background and Related Work (citation type: mentioning; confidence: 99%)
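As a concrete illustration of the feature set this statement attributes to Zulfanahri et al. (rectangularity, convexity, and tortuosity feeding an SVM for a regular/irregular margin decision), here is a minimal sketch. The exact feature definitions, the `masks`/`labels` inputs, and the SVM settings are assumptions, not the cited authors' code.

```python
# Minimal, hypothetical margin-irregularity pipeline:
# shape features from a nodule contour, then an SVM classifier.
import numpy as np
import cv2
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def margin_features(contour):
    area = cv2.contourArea(contour)
    perimeter = cv2.arcLength(contour, True)
    hull = cv2.convexHull(contour)
    hull_area = cv2.contourArea(hull)
    hull_perimeter = cv2.arcLength(hull, True)
    (_, _), (w, h), _ = cv2.minAreaRect(contour)  # minimum-area bounding rectangle
    rectangularity = area / (w * h) if w * h > 0 else 0.0
    convexity = area / hull_area if hull_area > 0 else 0.0
    tortuosity = perimeter / hull_perimeter if hull_perimeter > 0 else 0.0
    return [rectangularity, convexity, tortuosity]

def evaluate(masks, labels):
    """masks: binary nodule masks; labels: 0 = regular margin, 1 = irregular margin."""
    feats = []
    for mask in masks:
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
        feats.append(margin_features(max(contours, key=cv2.contourArea)))
    clf = SVC(kernel="rbf", C=1.0)
    return cross_val_score(clf, np.array(feats), np.array(labels), cv=5).mean()
```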