2019
DOI: 10.33640/2405-609x.1130
Stereo Photogrammetry vs Computed Tomography for 3D Medical Measurements

Abstract: The acquisition of 3D body measurements by computer-vision-based remote sensing is becoming increasingly important in clinical studies. Accurate 3D models of human anatomical surfaces are required in many clinical routines, such as disease diagnosis, patient follow-up, surgical planning, computer-assisted surgery, and various biomechanical applications. These models can be generated with different imaging techniques, such as computed tomography (CT). However, 3D conventional medical imaging like CT s…
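The comparison the abstract describes, linear measurements from stereo photogrammetry versus CT, typically reduces to agreement statistics between two paired measurement sets. A minimal sketch with illustrative numbers (hypothetical, not taken from the paper):

```python
import numpy as np

# Hypothetical paired measurements (mm) of the same anatomical landmarks:
# one set from stereo photogrammetry, one from CT. Values are made up.
photo = np.array([52.1, 33.8, 71.4, 18.9, 40.2])
ct    = np.array([51.7, 34.1, 70.8, 19.2, 40.0])

diff = photo - ct
mean_abs_err = np.mean(np.abs(diff))        # mean absolute difference (mm)
bias = np.mean(diff)                        # Bland-Altman bias
sd = np.std(diff, ddof=1)                   # sample SD of the differences
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement

print(f"MAD = {mean_abs_err:.2f} mm, bias = {bias:.2f} mm, "
      f"LoA = [{loa[0]:.2f}, {loa[1]:.2f}] mm")
```

Bland-Altman bias and limits of agreement are a standard way to report modality agreement; the actual statistics used in the paper may differ.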

Cited by 8 publications (8 citation statements)
References 23 publications
“…As this part is not used for identification, this photogrammetry protocol can at least be used for taxonomic studies. A lot of methods have been compared with photogrammetry, such as CT scanning (Giacomini et al., 2019; Hussien et al., 2019), laser scanning (Baltsavias, 1999; Gibelli et al., 2018), and surface scanning (Fau et al., 2016), and its accuracy and precision have been demonstrated (Bythell et al., 2001; De Menezes et al., 2010; Figueira et al., 2015; Varón-González et al., 2020). Here, we show that differences between models produced by photogrammetric means and micro-CT are small and should not impact species identification or morphometric analyses.…”
Section: Discussion (supporting)
confidence: 54%
“…Three-dimensional models, whether derived from photogrammetry or tomography, are always an interpretation of real objects. A lot of methods have been compared with photogrammetry, such as CT scanning (Giacomini et al., 2019; Hussien et al., 2019), laser scanning (Baltsavias, 1999; Gibelli et al., 2018), and surface scanning (Fau et al., 2016), and its accuracy and precision have been demonstrated (Bythell et al., 2001; De Menezes et al., 2010; Figueira et al., 2015; Varón-González et al., 2020). Here, we show that differences between models produced by photogrammetric means and micro-CT are small and should not impact species identification or morphometric analyses.…”
Section: Discussion (mentioning)
confidence: 99%
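The claim above, that photogrammetric and micro-CT models differ only slightly, rests on measuring the geometric deviation between two reconstructions of the same object. A minimal sketch of one common approach, per-vertex nearest-neighbor distance between two point clouds (synthetic data; real comparisons usually use meshes and tools such as ICP alignment first):

```python
import numpy as np

# Hypothetical vertex sets sampled from two 3D models of the same object,
# standing in for photogrammetry vs. micro-CT reconstructions.
rng_a = np.random.default_rng(0)
rng_b = np.random.default_rng(1)
model_a = rng_a.normal(size=(200, 3))
model_b = model_a + rng_b.normal(scale=0.01, size=(200, 3))  # small noise

# For each vertex in A, distance to its nearest vertex in B (brute force).
d = np.linalg.norm(model_a[:, None, :] - model_b[None, :, :], axis=2)
nearest = d.min(axis=1)

print(f"mean deviation = {nearest.mean():.4f}, max = {nearest.max():.4f}")
```

If the mean and maximum deviations are small relative to the measurement tolerance of the downstream task (here, morphometrics), the two modalities can be treated as interchangeable for that task.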
“…Scanning systems based on the photogrammetry technique are of interest to many industries. In the field of human scanning, examples include virtual fitting rooms [7], fitness and sport [8], education [9], and health and medicine [10,11,12,13,14]. Typically, the result has been a scan of a stationary model, although attempts have also been made to create interactive models, which may be used, for example, in anatomy education [15].…”
Section: Introduction (mentioning)
confidence: 99%
“…Also, it is used in topography and landslide measurements [1], monitoring slope displacement [5], buildings and structures deformation monitoring [13], and traffic accident management [14]. Moreover, digital close-range photogrammetry is widely used in industry [15,16,17,18,19], archaeology [20], architectural and cultural heritage documentation [21,22], agriculture [23], mineralogy [24], and clinical and medical applications [25].…”
Section: Introduction (mentioning)
confidence: 99%