While Mixed Prototyping has proved effective for the assessment of prototypes, this research explores its use for the generation of early prototypes. To satisfy end-users' needs, new products need to be designed with an early integration of end-user requirements. An efficient way to achieve this is to integrate end-users directly in the design process and give them an intelligible, interactive tool to perform specific design tasks. Current interactive tools for integrating end-users in the design process provide either a high level of immersion (e.g. CAVEs) or a high level of control over the virtual prototype (e.g. configurators). We designed a new Mixed Reality design tool that simultaneously allows end-users to be immersed in a virtual environment (immersion) and to interact with and modify a virtual prototype (control), resulting in effective end-user interactions. In two design use-case scenarios, we assessed end-user experience and satisfaction while using the tool, and we evaluated its impact on the creative process and the design outcomes. The findings show that when users are given a tool that lets them perform design tasks directly and modify a virtual prototype, compared to when they have no control, they are more engaged in the design tasks, more satisfied with the design process, and they produce more creative outcomes.
We designed a new Computer-Aided Design tool that can be used easily and intuitively by non-expert designers, such as the users of the products being designed. The target application is the design of highly customized products together with their final users, specifically the design of walking assistive devices with mobility-impaired people. The tool was developed alongside an ad-hoc protocol for accurately evaluating user satisfaction through questionnaires and psychophysiological measurements. Costly and complex technical products such as walking assistance devices require ad-hoc design processes to address the specific needs of each user, yet the characterization of user requirements in the early stages of design remains difficult due to their subjective nature and the communication gap between the user and the designer. To overcome these issues, we propose a new modular digital toolbox that supports co-design between users and designers. The tool combines a mixed reality hardware/software system with kansei (or affective) engineering techniques. The hardware consists of modular Tangible User Interfaces (TUIs), custom-made by 3D printing and powered by a 3D game engine. The interactive content is displayed in mixed reality, simultaneously to the user and the designer. Kansei data are collected from users through questionnaires and psychophysiological measurements during multiple collaboration phases.
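The per-phase kansei data collection described in this abstract can be sketched as a minimal data model. This is an illustrative sketch only: the names (`KanseiRecord`, `phase_summary`) and the specific channels (Likert questionnaire items, heart rate, skin conductance) are assumptions, not part of the authors' toolbox.

```python
from dataclasses import dataclass, field
from statistics import mean

# Hypothetical record for one collaboration phase: questionnaire scores
# plus psychophysiological samples gathered while the user wears the TUIs.
@dataclass
class KanseiRecord:
    phase: str
    questionnaire: dict[str, int]  # item -> Likert score (e.g. 1-7)
    heart_rate: list[float] = field(default_factory=list)        # beats/min
    skin_conductance: list[float] = field(default_factory=list)  # microsiemens

def phase_summary(record: KanseiRecord) -> dict[str, float]:
    """Aggregate one collaboration phase into per-channel means for review."""
    return {
        "questionnaire_mean": mean(record.questionnaire.values()),
        "hr_mean": mean(record.heart_rate) if record.heart_rate else float("nan"),
        "scl_mean": mean(record.skin_conductance) if record.skin_conductance else float("nan"),
    }

record = KanseiRecord(
    phase="TUI evaluation",
    questionnaire={"comfort": 6, "stability": 5, "appeal": 7},
    heart_rate=[72.0, 75.0, 78.0],
    skin_conductance=[4.1, 4.3],
)
print(phase_summary(record))
```

In a real deployment the aggregation would be per time window and synchronized with the mixed reality session log; the point here is only the pairing of subjective (questionnaire) and objective (physiological) channels per phase.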
This article proposes a novel approach to quick concept generation and validation of interactive product behaviours. When designing for user-product interactions, designers have to consider spatial and behavioural elements in addition to form and tangible aspects, and quick validation of the generated concepts is often done through functional prototyping at later design stages. As a result, the designed outcomes often depend on parameters such as the designer's familiarity with the design tools used, the level of fidelity achieved while prototyping, and the frequency of design iterations, limiting a thorough exploration of the concept space and the creativity of outcomes at the early design stages. This research targets such dependencies and hindrances to creativity at the concept generation stage through a Mixed Reality implementation. This work establishes requirements for a suitable design tool and presents a proof-of-concept use case. A design task to swiftly ideate, create, and revise concepts of playful product behaviours was performed to assess the impact of the implemented method. In an empirical study, a broader exploration of the solution space and an overall improvement in creative output flow were observed compared to the design outcomes of a traditional storyboard approach. Although the implemented design tool's unfamiliarity and capability presented a challenge, a significant increase in iterative concept-design behaviour was observed throughout the study.
Part 14: Design and Integration Issues. Some costly and complex technical products, such as walking assistance devices, require ad-hoc design processes to address the very specific needs of each user. However, the depiction of customer/user requirements in the early stage of design remains difficult due to their subjective nature and the separation between the user and the designer. To bridge these gaps, we introduce a new modular digital toolbox based on a mixed reality system and kansei engineering techniques. The hardware consists of modular Tangible User Interfaces (TUIs), custom-made by 3D printing and powered by a 3D game engine. The interactive content is displayed in mixed reality, simultaneously to the user and the designer. Kansei data are collected through questionnaires and psychophysiological measurements during multiple collaboration phases. The modularity of the system allows the evaluation of various TUIs, 3D content behaviours, and the best-fitting type of display.
To design interactive behaviours for their products, designers and makers have to use high-fidelity tools such as electronic prototyping kits, which involve sensors and programming to incorporate interactions into products and depend on the availability of hardware. Not every designer is comfortable using such tools to ideate and test concept ideas, which eventually slows them down. Thus, there is a need for a design tool that reduces dependence on the complex components of such kits while exploring new product design concepts at an early stage. In this work, we propose a Mixed Reality system that we developed to simulate interactive behaviours of products using designed visual interaction blocks. The system is implemented in three stages: idea generation, creation of interactions, and revision of interactive behaviours. The implemented virtual scenario was shown to elicit high motivation and appeal among users, resulting in an inventive and creative design experience. As a result, designers will be able to create and revise their interaction-behaviour design concepts virtually with relative ease, leading to faster concept generation and validation.
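The "visual interaction blocks" described above can be sketched as trigger-action pairs replayed against a virtual scene state. This is a hedged illustration, not the authors' implementation: the names (`InteractionBlock`, `simulate`) and the lamp example are hypothetical, and the real system runs inside a Mixed Reality engine rather than plain Python.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical interaction block: a named trigger-action pair, loosely
# modelled on the visual blocks a designer would chain together.
@dataclass
class InteractionBlock:
    name: str
    trigger: Callable[[dict], bool]  # predicate over the virtual scene state
    action: Callable[[dict], None]   # mutation applied when the trigger fires

def simulate(blocks: list[InteractionBlock], state: dict, events: list[dict]) -> list[str]:
    """Replay events against the blocks; return the names of blocks that fired."""
    fired = []
    for event in events:
        state.update(event)
        for block in blocks:
            if block.trigger(state):
                block.action(state)
                fired.append(block.name)
    return fired

# Toy behaviour: a lamp product that glows when touched, dims when released.
blocks = [
    InteractionBlock("glow", lambda s: s.get("touched"),
                     lambda s: s.update(brightness=1.0)),
    InteractionBlock("dim", lambda s: not s.get("touched"),
                     lambda s: s.update(brightness=0.1)),
]
state = {}
fired = simulate(blocks, state, [{"touched": True}, {"touched": False}])
print(fired, state["brightness"])  # ['glow', 'dim'] 0.1
```

Revising a behaviour in this model amounts to swapping or re-ordering blocks and replaying the same event trace, which mirrors the abstract's three stages of idea generation, creation of interactions, and revision.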