Behavior, actions, pose, facial expressions, and speech are channels that convey human emotion, and extensive research has explored the relationships between these channels and emotions. The proposed method combines a neural-network classifier with image and speech processing to classify four universal emotions: happiness, anger, sadness, and neutral. Speech processing extracts spectral and temporal features such as MFCCs and energy, and the resulting feature vector is given as input to the neural network. In image processing, Gabor-filter texture features are extracted at a set of selected feature points; their mutual information is computed and passed to the neural network for classification. Experimental results demonstrate the efficacy of audio-visual cues, particularly when using only a few prominent features: the combined approach achieves an overall accuracy above 85%.
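The speech features mentioned above (MFCCs and short-time energy) can be sketched with plain numpy. This is a minimal illustration, not the authors' implementation: the sampling rate, frame length, hop size, FFT size, and filterbank counts are assumed values chosen for the example.

```python
import numpy as np

def frame_signal(x, frame_len, hop):
    # Split a 1-D signal into overlapping frames (rows).
    n_frames = 1 + (len(x) - frame_len) // hop
    idx = np.arange(frame_len)[None, :] + hop * np.arange(n_frames)[:, None]
    return x[idx]

def short_time_energy(frames):
    # Temporal feature: sum of squared samples per frame.
    return np.sum(frames.astype(float) ** 2, axis=1)

def mfcc(x, sr=16000, frame_len=400, hop=160, n_fft=512, n_mels=26, n_ceps=13):
    # Spectral feature: Mel-Frequency Cepstral Coefficients.
    frames = frame_signal(x, frame_len, hop) * np.hamming(frame_len)
    spec = np.abs(np.fft.rfft(frames, n=n_fft)) ** 2        # power spectrum

    # Triangular mel filterbank spanning 0 .. sr/2.
    hz2mel = lambda f: 2595 * np.log10(1 + f / 700.0)
    mel2hz = lambda m: 700 * (10 ** (m / 2595.0) - 1)
    mel_pts = np.linspace(hz2mel(0), hz2mel(sr / 2), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel2hz(mel_pts) / sr).astype(int)
    fbank = np.zeros((n_mels, spec.shape[1]))
    for m in range(1, n_mels + 1):
        l, c, r = bins[m - 1], bins[m], bins[m + 1]
        fbank[m - 1, l:c] = (np.arange(l, c) - l) / max(c - l, 1)
        fbank[m - 1, c:r] = (r - np.arange(c, r)) / max(r - c, 1)

    logmel = np.log(spec @ fbank.T + 1e-10)                 # log mel energies
    # DCT-II decorrelates the log energies; keep the first n_ceps coefficients.
    n = np.arange(n_mels)
    dct = np.cos(np.pi * np.outer(np.arange(n_ceps), 2 * n + 1) / (2 * n_mels))
    return logmel @ dct.T                                   # (n_frames, n_ceps)
```

In practice the per-frame MFCC and energy values would be concatenated (or summarized by statistics such as mean and variance) to form the fixed-length vector fed to the neural network.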
Visually impaired people face many challenges when choosing clothes with complex patterns and colors; rotation, scaling, and variation in lighting make cloth recognition a challenging problem. This work develops an automatic clothing-pattern recognition technique that classifies patterns into four classes (plaid, striped, irregular, and patternless) using image processing, machine learning, and deep learning, with MATLAB as the simulation tool. Color classification uses the Hue-Saturation-Intensity (HSI) color model. To recognize clothing patterns, global and local features are extracted, including Radon signatures and the Grey Level Co-occurrence Matrix (GLCM). Pattern recognition is performed with machine-learning algorithms such as KNN and SVM, and with deep networks including AlexNet, GoogLeNet, VGG-16, and VGG-19. The CCNY Clothing Pattern dataset is used to evaluate the effectiveness of the algorithms; the maximum accuracy of 97.9% was obtained with the VGG-19 deep neural network.
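To make the GLCM texture features concrete, the following is a minimal numpy sketch of a co-occurrence matrix for a single pixel offset, plus three standard Haralick-style properties. The offset, number of grey levels, and choice of properties are illustrative assumptions, not the paper's exact configuration (the original uses MATLAB and typically combines several offsets and orientations).

```python
import numpy as np

def glcm(img, dx=1, dy=0, levels=8):
    # Grey Level Co-occurrence Matrix for one pixel offset (dx, dy).
    # img is assumed to be a uint8 greyscale image (values 0..255).
    q = (img.astype(float) / 256 * levels).astype(int)  # quantize to `levels` grey levels
    P = np.zeros((levels, levels))
    h, w = q.shape
    for y in range(h - dy):
        for x in range(w - dx):
            P[q[y, x], q[y + dy, x + dx]] += 1
    return P / P.sum()                                  # normalize to joint probabilities

def glcm_features(P):
    i, j = np.indices(P.shape)
    return {
        "contrast": np.sum((i - j) ** 2 * P),           # local intensity variation
        "energy": np.sum(P ** 2),                       # uniformity of the texture
        "homogeneity": np.sum(P / (1 + np.abs(i - j))), # closeness to the diagonal
    }
```

A striped fabric yields high contrast along the offset crossing the stripes, while a patternless fabric concentrates mass on the GLCM diagonal (high energy and homogeneity); such scalar features are what a KNN or SVM classifier would consume.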