Most approaches used to release scents in recent olfactory displays rely on predefined timing for decision making. The applicability of such an approach is questionable in scenarios like video games or virtual reality applications, where the content is dynamic in nature and thus not known in advance. Yet these applications, along with 4D cinemas and theme-park attractions built around short films, depend on olfaction to enhance the user's experience and sense of involvement. Recently, associating the release of scents with the visual content of the scene has been studied. This research extends one such work by considering auditory content alongside visual content. A dataset of 1,200 audio segments was collected from the computer game Minecraft. The Inception v3 model was used to classify the sound and image datasets. Ground-truth labeling of this dataset yielded four classes: grass, fire, thunder, and zombie. Using a transfer learning approach, accuracies of 91% and 94% were achieved for the sound and image models, respectively.
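The abstract names Inception v3 with transfer learning but gives no architectural details. Below is a minimal sketch of that setup in Keras, assuming an ImageNet-pretrained frozen base and a new four-class softmax head; the head size, dropout rate, and optimizer are illustrative assumptions, not the authors' exact configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 4  # grass, fire, thunder, zombie

def build_classifier(input_shape=(299, 299, 3)):
    # Pretrained Inception v3 base, without its ImageNet classification head.
    base = tf.keras.applications.InceptionV3(
        weights="imagenet", include_top=False, input_shape=input_shape)
    base.trainable = False  # transfer learning: keep pretrained features fixed

    model = models.Sequential([
        base,
        layers.GlobalAveragePooling2D(),
        layers.Dense(256, activation="relu"),   # assumed head size
        layers.Dropout(0.2),                    # assumed regularization
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# The same architecture could serve both modalities: screenshots are fed
# directly, while audio segments would first be rendered as spectrogram
# images (an assumption; the paper does not detail its audio preprocessing).
image_model = build_classifier()
sound_model = build_classifier()
```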
Although olfaction can enhance the user's experience in virtual environments, it is not widely utilized in virtual content. This is because olfactory displays are either unaware of the content in the virtual world or are application specific. Broad context awareness can be enabled through image recognition via machine learning: screenshots from the virtual world are analyzed for the presence of virtual scent emitters, allowing the olfactory display to respond by generating the corresponding smells. A Convolutional Neural Network (CNN) based on the Inception model was used to train the system. To evaluate the model's accuracy, it was trained on the computer game Minecraft. The model achieved 97% accuracy, reaching 99% in some cases.
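To make the screenshot-to-scent pipeline concrete, here is a hypothetical sketch of the loop described above. The screen-capture source, the `release_scent` driver stub, the class-to-scent mapping, and the confidence threshold are all assumptions for illustration; none are specified in the source.

```python
import numpy as np
import tensorflow as tf

CLASS_TO_SCENT = {0: "grass", 1: "fire", 2: "thunder", 3: "zombie"}
CONFIDENCE_THRESHOLD = 0.8  # assumed: only emit scent on confident detections

def release_scent(name: str) -> None:
    # Placeholder for the olfactory-display driver (hardware-specific).
    print(f"Releasing scent: {name}")

def classify_and_emit(model, frame: np.ndarray) -> None:
    """Classify one captured game frame and trigger the matching emitter."""
    x = tf.image.resize(frame, (299, 299))            # Inception v3 input size
    x = tf.keras.applications.inception_v3.preprocess_input(x)
    probs = model.predict(np.expand_dims(x, axis=0), verbose=0)[0]
    label = int(np.argmax(probs))
    if probs[label] >= CONFIDENCE_THRESHOLD:
        release_scent(CLASS_TO_SCENT[label])
```

In a live setting this function would run on each captured frame (or at a fixed sampling rate), so debouncing repeated detections of the same class would likely be needed to avoid flooding the scent emitter.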