The detection of newly appearing and changing pigmented skin lesions (PSLs) is essential for the timely diagnosis of cutaneous melanoma. Total body skin examination (TBSE) procedures, currently practiced for this purpose, can be extremely time-consuming for patients with numerous lesions. In addition, these procedures are prone to subjectivity when selecting PSLs for baseline image comparison, increasing the risk of missing a developing cancer. To address these issues, we propose a new photogrammetry-based total body scanning system that acquires skin surface images under cross-polarized light. Equipped with 21 high-resolution cameras and a turntable, the scanner automatically acquires a set of overlapping images covering 85%-90% of the patient's skin surface. These images are used for the automated mapping of PSLs and the estimation of their changes between explorations. The maps produced relate images of individual lesions to their locations on the patient's body, solving the body-to-image and image-to-image correspondence problem in TBSEs. Currently, the scanner is limited to patients with sparse body hair and, for a complete skin examination, the scalp, palms, soles, and inner arms must be photographed manually. Initial tests showed that the scanner can be successfully applied to the automated mapping and temporal monitoring of multiple lesions: PSLs relevant for follow-up were repeatedly mapped across several explorations. Moreover, during baseline image comparison, all lesions with artificially induced changes were correctly identified as "evolved."
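The mapping step described above pairs each lesion image with a position on the patient's body and carries that pairing across explorations. The sketch below illustrates one way such a map could be organized as a data structure; the class and field names (`LesionRecord`, `LesionMap`, `body_location`) are hypothetical and are not taken from the system described here.

```python
# Illustrative sketch only: a minimal structure relating lesion image crops
# to body-surface locations across explorations. All names are hypothetical.
from dataclasses import dataclass, field
from typing import Dict, List, Tuple


@dataclass
class LesionRecord:
    """One observation of a lesion in a single exploration."""
    exploration_id: str                         # e.g. "baseline" or "follow-up-1"
    image_path: str                             # crop of the lesion in a scanner image
    body_location: Tuple[float, float, float]   # 3-D point on the body surface model


@dataclass
class LesionMap:
    """Maps a persistent lesion ID to all of its observations over time."""
    records: Dict[str, List[LesionRecord]] = field(default_factory=dict)

    def add_observation(self, lesion_id: str, record: LesionRecord) -> None:
        # Append the observation to this lesion's history.
        self.records.setdefault(lesion_id, []).append(record)

    def history(self, lesion_id: str) -> List[LesionRecord]:
        # Return a lesion's observations, ordered for baseline comparison.
        return sorted(self.records.get(lesion_id, []),
                      key=lambda r: r.exploration_id)
```

In such a layout, change estimation between explorations reduces to comparing consecutive entries returned by `history()` for the same lesion ID.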
Total body photography is used for the early detection of malignant melanoma, primarily as a means of temporal skin surface monitoring. In prior work, we presented a scanner with a set of algorithms to map and detect changes in pigmented skin lesions, demonstrating that the process of total body image acquisition and processing can be fully automated. The key procedure in these algorithms is skin lesion matching, which determines whether two images depict the same real lesion. In this paper, we aim to improve it with respect to false positive and false negative outcomes. To this end, we developed two novel methods: one based on successive rigid transformations of 3-D point clouds and one based on non-rigid coordinate plane deformations in regions of interest around the lesions. In both approaches, we applied a robust outlier rejection procedure based on progressive graph matching. Using the scanner's images, we created a ground truth dataset tailored to diversify false positive match scenarios. The algorithms were evaluated in terms of precision and recall, and the results demonstrated the superiority of the second approach in all tests. In the complete inter-positional matching experiment, it reached a precision of 99.92% and a recall of 81.65%, a significant improvement over our original method.
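As an illustration of the rigid-alignment building block underlying the first method, the sketch below estimates a least-squares rotation and translation between two sets of corresponding 3-D lesion positions using the classical Kabsch/SVD solution. This is a generic example of rigid point-cloud alignment, not the matching pipeline described in the paper; the function name `rigid_align` and the residual-threshold rejection suggested in the comments are assumptions.

```python
# Illustrative sketch only: least-squares rigid alignment (Kabsch/SVD) of two
# 3-D point clouds with known correspondences. Not the authors' pipeline.
import numpy as np


def rigid_align(src: np.ndarray, dst: np.ndarray):
    """Return rotation R and translation t minimizing ||R @ src_i + t - dst_i||.

    src, dst: (N, 3) arrays of corresponding 3-D points.
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t


# Usage: align lesion centroids from one exploration onto another; candidate
# pairs with large residuals after alignment could be rejected as non-matches.
baseline = np.random.rand(10, 3)
angle = np.deg2rad(15.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0,            0.0,           1.0]])
follow_up = baseline @ R_true.T + np.array([0.05, -0.02, 0.10])
R, t = rigid_align(baseline, follow_up)
residuals = np.linalg.norm(baseline @ R.T + t - follow_up, axis=1)
print(residuals.max())  # close to zero for noise-free correspondences
```

In practice, a single rigid transform cannot capture posture changes and skin deformation over the whole body, which motivates both the successive application of such transforms and the non-rigid, region-of-interest-based second method evaluated above.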