Due to current trends in the manufacturing industry, such as mass customization, manual operations contribute substantially to the overall cost of a product. Methods-Time Measurement (MTM) identifies the optimization potential of manual workplaces, which significantly influences workers' productivity. However, traditional MTM requires great effort to observe and transcribe manual assembly processes. Various digital approaches exist that facilitate MTM analyses, but most of them require real workplaces or cardboard mock-ups; it would therefore be beneficial to conduct a virtual MTM analysis in earlier phases of production planning. However, the quality of virtual MTM analyses compared to traditional MTM conducted in reality has not yet been assessed. This paper addresses this gap with a comparative user study in which 21 participants completed the same task both at a real workplace and at a virtual workplace accessed via virtual reality (VR) technology. Our results show that the participants' MTM-2 values achieved at the VR workplace are comparable to those at the real workplace. However, time-study data reveal that participants moved considerably more slowly in VR and thus needed more time to accomplish the task. Consequently, measuring manual work in VR requires predetermined time systems such as MTM-2, since time-study data alone are insufficient. This paper also serves as a proof of concept for future studies investigating automated transcription systems that would further reduce the effort of conducting MTM analyses.
Despite the high level of automation in industrial production, manual operations still play an important role and contribute significantly to overall production costs. For the evaluation of these manual processes, Methods-Time Measurement (MTM) is widely used. The method is applied to real workplaces or mock-ups thereof, although Virtual Reality (VR) can also be used to represent such workplaces. However, the evaluation of the workers' performed actions is still done manually, which is a time-consuming and error-prone process. This paper introduces an approach to automatically detect the full-body actions of users in VR and to derive the appropriate MTM values without knowledge of a pre-existing work plan. The detection algorithm is explained in detail, and its performance is analyzed through a user study with 30 participants.