This work relates to two research areas, 3D computer graphics and augmented reality, combined in a display on mobile devices. It presents the creation of three interactive 3D models of domestic animals, based on realistically drawn images, that can be displayed on mobile devices using augmented reality. The textured animal models are shown in the application Augmented Animals (Slovenian: Obogatene živali) through a simple user interface. The usability of the application is demonstrated by the detection of an image target, i.e., a printed interactive card, which establishes the interaction between the mobile device and the augmented paper. When the mobile device's camera recognizes the target, the selected animal is displayed on the screen, enhancing the real environment with animated 3D characters. Through the on-screen 3D character and the user interface, each animal can be presented in three different animated movements.

The first empirical part of this work was carried out in Blender, in which we created all three animal 3D characters. First, we modelled the animals from the initial templates into recognizable 3D meshes, onto which we then mapped the textures. This was followed by the construction of a system of bones and animation controls, on the basis of which we created the animal animations. After this step, we transferred the project to Unity, where we built an application that presents the characters in augmented reality. The result of the entire work is a set of animal characters in the form of animated 3D models that can be displayed in augmented reality mode on mobile devices using interactive cards.
The selected testing parameters showed certain differences in rendering between the two tested mobile devices, depending on the chosen subdivision level of the 3D character. As for target recognition under varying lighting conditions, distances, and tilt angles between the image target and the mobile device, the best user experience is obtained when the image target is captured from a distance of 15–20 cm, from a bird's-eye view, under good lighting conditions.
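The recommended capture conditions above can be encoded as a simple validity check. A minimal sketch in Python follows; the function name, the 15-degree tolerance used to approximate a "bird's-eye view", and the 300 lx floor used to stand in for "good lighting" are illustrative assumptions, not values reported by the study (only the 15–20 cm distance range comes from the text):

```python
def is_optimal_capture(distance_cm: float, tilt_deg: float, lux: float) -> bool:
    """Check whether capture conditions match the recommended setup.

    distance_cm: camera-to-target distance; the study recommends 15-20 cm.
    tilt_deg:    deviation from a top-down (bird's-eye) view, in degrees;
                 the 15-degree tolerance is an illustrative assumption.
    lux:         ambient illuminance; the 300 lx threshold for "good
                 lighting" is an illustrative assumption.
    """
    in_distance = 15.0 <= distance_cm <= 20.0
    near_top_down = abs(tilt_deg) <= 15.0
    well_lit = lux >= 300.0
    return in_distance and near_top_down and well_lit


# 18 cm away, nearly top-down, typical office lighting -> optimal
print(is_optimal_capture(18.0, 5.0, 400.0))   # True
# 30 cm away violates the recommended distance range -> not optimal
print(is_optimal_capture(30.0, 5.0, 400.0))   # False
```

Such a check could, for example, drive an on-screen hint in the app telling the user to move the device closer to the interactive card.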