In this paper, a new multi-view human action recognition approach is proposed that exploits low-dimensional motion information of actions. Before feature extraction, pre-processing steps are performed to remove noise from the silhouettes introduced by imperfect but realistic segmentation. 2D motion templates based on the Motion History Image (MHI) are computed for each view/action video, which addresses the high-dimensionality issue arising from multi-camera data. Histograms of Oriented Gradients (HOG) are used as an efficient description of the MHIs. Finally, a Nearest Neighbor (NN) classifier is employed to classify the HOG-based descriptions of the MHIs. Compared to existing approaches, the proposed method has three advantages: 1) it does not require a fixed camera setup during the training and testing stages, so missing camera views can be tolerated; 2) it has lower memory and bandwidth requirements; and hence 3) it is computationally efficient, which makes it suitable for real-time action recognition. The proposed method is evaluated on the new MuHAVi-uncut dataset, which contains a large number of action categories and a large set of camera views with noisy silhouettes. To the best of our knowledge, this is the first report of results on this dataset, and it can serve as a baseline for future work to improve upon. Multi-view experiments on this dataset give a high accuracy of 95.4% using the Leave-One-Sequence-Out (LOSO) cross-validation technique and compare well with similar state-of-the-art approaches.
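Conceptually, the pipeline can be illustrated with the short Python sketch below. This is a minimal sketch, not the authors' implementation: it builds an MHI from denoised silhouettes, describes it with a HOG descriptor, and classifies it with a 1-NN classifier. The helper names and parameter values (e.g., the MHI duration and the HOG window size) are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of an MHI + HOG + 1-NN pipeline (assumed parameters).
import cv2
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

MHI_DURATION = 30  # frames a motion pixel stays visible before decaying (assumed)
HOG = cv2.HOGDescriptor((64, 64), (16, 16), (8, 8), (8, 8), 9)

def motion_history_image(silhouettes):
    """Build a 2D MHI from a list of binary silhouette frames (uint8, 0/255)."""
    mhi = np.zeros(silhouettes[0].shape, dtype=np.float32)
    for sil in silhouettes:
        # Denoise the silhouette (morphological opening) before the MHI update.
        clean = cv2.morphologyEx(sil, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
        moving = clean > 0
        mhi[moving] = MHI_DURATION                       # refresh moving pixels
        mhi[~moving] = np.maximum(mhi[~moving] - 1, 0)   # decay static pixels
    return cv2.normalize(mhi, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

def hog_of_mhi(mhi):
    """Resize the MHI to the HOG window and return its descriptor vector."""
    return HOG.compute(cv2.resize(mhi, (64, 64))).ravel()

def train_and_classify(train_videos, test_silhouettes):
    """train_videos: list of (silhouette_frames, action_label) pairs."""
    X = [hog_of_mhi(motion_history_image(frames)) for frames, _ in train_videos]
    y = [label for _, label in train_videos]
    clf = KNeighborsClassifier(n_neighbors=1).fit(X, y)
    return clf.predict([hog_of_mhi(motion_history_image(test_silhouettes))])[0]
```

Because each view/action video is reduced to a single low-dimensional HOG vector, a missing camera view simply means one fewer training vector rather than a change in the feature layout.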
Melanoma is a skin cancer caused by ultraviolet radiation from the Sun and has a survival rate of only 15-20%. Late diagnosis of melanoma leads to severe malignancy, with metastasis spreading to other body organs such as the liver, lungs, and brain. Dermatologists analyze pigmented skin lesions to discriminate melanoma from other skin diseases; however, imprecise analysis results in a series of biopsies and complicates treatment. The process of melanoma detection can be expedited through computer vision methods that analyze dermoscopic images automatically. However, the visual similarity between normal and infected skin regions, together with artifacts such as gel bubbles, hair, and clinical marks, leads to low accuracy rates for these approaches. To overcome these challenges, this article presents a melanoma detection and segmentation approach that brings a significant improvement in accuracy over state-of-the-art approaches. As a first step, artifacts such as hairs, gel bubbles, and clinical marks are removed from the dermoscopic images by applying morphological operations, and image regions are sharpened. Afterwards, for infected-region detection, the YOLOv4 object detector is tuned for melanoma detection to discriminate the highly correlated infected and non-infected regions. Once the bounding boxes around the melanoma regions are obtained, the infected regions are extracted by applying an active contour segmentation approach. For performance evaluation, the proposed approach is evaluated on the ISIC2018 and ISIC2016 datasets, and the results are compared against state-of-the-art melanoma detection and segmentation techniques. Our proposed approach achieves an average Dice score of 1 and a Jaccard coefficient of 0.989. The segmentation results validate the practical value of our method for the development of a clinical decision support system for melanoma diagnosis, in contrast to state-of-the-art methods. The YOLOv4 detector is capable of detecting multiple skin diseases in the same patient as well as multiple diseases across different patients.
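The three-stage pipeline can be illustrated with the Python sketch below. This is a minimal sketch, not the authors' implementation: a black-hat-based artifact suppression with sharpening stands in for the paper's morphological pre-processing, a YOLOv4 model is run through OpenCV's DNN module, and a morphological Chan-Vese active contour is used as a stand-in for the paper's active contour to refine each detected box into a lesion mask. The configuration/weight file names and all thresholds are assumptions.

```python
# Illustrative sketch: preprocessing -> YOLOv4 detection -> active-contour segmentation.
import cv2
import numpy as np
from skimage.segmentation import morphological_chan_vese

def remove_artifacts(img):
    """Suppress hairs/clinical marks via a black-hat mask + inpainting, then sharpen."""
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    blackhat = cv2.morphologyEx(gray, cv2.MORPH_BLACKHAT,
                                cv2.getStructuringElement(cv2.MORPH_RECT, (17, 17)))
    _, hair_mask = cv2.threshold(blackhat, 10, 255, cv2.THRESH_BINARY)
    clean = cv2.inpaint(img, hair_mask, 3, cv2.INPAINT_TELEA)
    sharpen_kernel = np.array([[0, -1, 0], [-1, 5, -1], [0, -1, 0]], dtype=np.float32)
    return cv2.filter2D(clean, -1, sharpen_kernel)

def detect_lesions(img, cfg="yolov4-melanoma.cfg", weights="yolov4-melanoma.weights"):
    """Run a YOLOv4 model fine-tuned for melanoma through OpenCV's DNN module
    (file names are hypothetical)."""
    net = cv2.dnn.readNetFromDarknet(cfg, weights)
    model = cv2.dnn_DetectionModel(net)
    model.setInputParams(size=(416, 416), scale=1 / 255.0, swapRB=True)
    _, _, boxes = model.detect(img, confThreshold=0.5, nmsThreshold=0.4)
    return boxes  # one (x, y, w, h) box per detected lesion

def segment_lesion(img, box, iterations=200):
    """Refine a detected box into a pixel mask with a morphological Chan-Vese
    active contour applied inside the box."""
    x, y, w, h = box
    roi = cv2.cvtColor(img[y:y + h, x:x + w], cv2.COLOR_BGR2GRAY).astype(float)
    mask = np.zeros(img.shape[:2], dtype=np.uint8)
    mask[y:y + h, x:x + w] = morphological_chan_vese(roi, iterations).astype(np.uint8)
    return mask

# Usage: preprocess, detect, then segment each detected lesion.
# image = cv2.imread("dermoscopic_image.jpg")
# pre = remove_artifacts(image)
# masks = [segment_lesion(pre, b) for b in detect_lesions(pre)]
```

Keeping detection and segmentation as separate stages, as above, lets the detector handle multiple lesions per image while the contour step only has to separate lesion from skin within each box.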