Virtual Reality (VR), an immersive technology that replicates an environment via computer-simulated reality, receives a great deal of attention in the entertainment industry. However, VR also has great potential in other areas, such as the medical domain; examples include intervention planning, training, and simulation. This is especially useful in medical operations where an aesthetic outcome is important, such as facial surgeries. Unfortunately, importing medical data into Virtual Reality devices is not necessarily trivial, in particular when a direct connection to a proprietary application is desired. Moreover, most researchers do not build their medical applications from scratch, but rather leverage platforms like MeVisLab, MITK, OsiriX, or 3D Slicer. These platforms have in common that they use libraries like ITK and VTK and provide a convenient graphical interface. However, ITK and VTK do not support Virtual Reality directly. In this study, the usage of a Virtual Reality device for medical data under the MeVisLab platform is presented. The OpenVR library is integrated into the MeVisLab platform, allowing direct and uncomplicated use of the HTC Vive head-mounted display inside MeVisLab. Medical data coming from other MeVisLab modules can be connected directly, per drag and drop, to the Virtual Reality module, which renders the data inside the HTC Vive for immersive virtual reality inspection.
In this publication, the interactive planning and reconstruction of cranial 3D implants under the medical prototyping platform MeVisLab is introduced as an alternative to commercial planning software. In doing so, a MeVisLab prototype consisting of a customized data-flow network and a custom C++ module was set up. As a result, the Computer-Aided Design (CAD) software prototype guides a user through the whole workflow of generating an implant. The workflow begins with loading and mirroring the patient's head to obtain an initial curvature for the implant. The user can then perform an additional Laplacian smoothing, followed by a Delaunay triangulation. The result is an aesthetically pleasing and well-fitting 3D implant, which can be stored in a CAD file format, e.g. STereoLithography (STL), for 3D printing. The 3D-printed implant can finally be used for an in-depth pre-surgical evaluation or even as a real implant for the patient. In a nutshell, our research and development shows that a customized MeVisLab software prototype can be used as an alternative to complex commercial planning software, which may also not be available in every clinic. Finally, this frees us from being bound to available commercial software and allows exploring other options that might improve the workflow.
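The Laplacian smoothing step named in the workflow above can be illustrated independently of MeVisLab. The following is a minimal sketch, not the authors' C++ module: it assumes the implant surface is given as a vertex array plus a per-vertex neighbor list, and iteratively moves each vertex toward the centroid of its neighbors (the classical Laplacian smoothing update with step size `lam`).

```python
import numpy as np

def laplacian_smooth(vertices, neighbors, lam=0.5, iterations=10):
    """Classical Laplacian mesh smoothing.

    vertices  : (N, 3) float array of vertex positions
    neighbors : list of index lists; neighbors[i] holds the vertices
                adjacent to vertex i in the mesh
    lam       : step size in (0, 1]; larger values smooth more per pass
    """
    v = np.asarray(vertices, dtype=float).copy()
    for _ in range(iterations):
        # centroid of each vertex's one-ring neighborhood
        centroids = np.array([v[idx].mean(axis=0) for idx in neighbors])
        # move every vertex a fraction lam toward its neighborhood centroid
        v += lam * (centroids - v)
    return v
```

Applied to a jagged curvature estimate (such as the mirrored skull surface), repeated passes damp high-frequency bumps while roughly preserving the overall shape; in practice, `lam` and `iterations` trade smoothness against shrinkage.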
Study Objectives The differentiation of isolated rapid eye movement (REM) sleep behavior disorder (iRBD) or its prodromal phase (prodromal RBD) from other disorders with motor activity during sleep is critical for identifying α-synucleinopathy in an early stage. Currently, definite RBD diagnosis requires video polysomnography (vPSG). The aim of this study was to evaluate automated 3D video analysis of leg movements during REM sleep as an objective diagnostic tool for iRBD. Methods A total of 122 participants (40 iRBD, 18 prodromal RBD, 64 participants with other disorders with motor activity during sleep) were recruited among patients undergoing vPSG at the Sleep Disorders Unit, Department of Neurology, Medical University of Innsbruck. 3D videos synchronous to vPSG were recorded. Lower limb movement rate, duration, extent, and intensity were computed using newly developed software. Results The analyzed 3D movement features were significantly increased in subjects with iRBD compared to prodromal RBD and other disorders with motor activity during sleep. Minor leg jerks with a duration < 2 seconds discriminated iRBD from other motor activity during sleep with the highest accuracy (90.4%). Automatic 3D analysis did not differentiate between prodromal RBD and other disorders with motor activity during sleep. Conclusions Automated 3D video analysis of leg movements during REM sleep is a promising diagnostic tool for identifying subjects with iRBD in a sleep laboratory population and is able to distinguish iRBD from subjects with other motor activities during sleep. For future application as a screening tool, further studies should investigate the usefulness of this approach when no information about sleep stages from vPSG is available, as well as in the home environment.
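The movement features named in the abstract (rate, duration, extent, intensity) can be sketched generically. The authors' software is not public, so the following is only an illustrative assumption: given a 1-D per-frame motion signal for a leg region of interest (e.g., mean depth change between consecutive 3D frames), movement episodes are segmented as runs above a threshold and then summarized. The function name and threshold convention are hypothetical.

```python
import numpy as np

def movement_features(motion, fps, threshold):
    """Segment a 1-D motion signal into supra-threshold episodes.

    motion    : per-frame motion magnitude (arbitrary units)
    fps       : frames per second of the recording
    threshold : activity threshold separating movement from rest

    Returns (rate_per_minute, durations_sec, extents, intensities).
    """
    active = motion > threshold
    # rising/falling edges of the boolean activity mask
    edges = np.diff(active.astype(int))
    starts = np.flatnonzero(edges == 1) + 1
    ends = np.flatnonzero(edges == -1) + 1
    if active[0]:
        starts = np.r_[0, starts]          # signal begins mid-episode
    if active[-1]:
        ends = np.r_[ends, active.size]    # signal ends mid-episode
    durations = (ends - starts) / fps                              # seconds
    extents = np.array([motion[s:e].max() for s, e in zip(starts, ends)])
    intensities = np.array([motion[s:e].mean() for s, e in zip(starts, ends)])
    rate = len(starts) / (motion.size / fps / 60.0)                # per minute
    return rate, durations, extents, intensities
```

With episode durations in hand, the discriminative feature reported in the abstract (minor jerks shorter than 2 seconds) would correspond to counting episodes with `durations < 2.0`.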
Background: Unlike other episodic sleep disorders in childhood, there are no agreed severity indices for rhythmic movement disorder. While movements can be characterized in detail by polysomnography, in our experience most children inhibit rhythmic movement during polysomnography. Actigraphy and home video allow assessment in the child's own environment, but both have limitations. Standard actigraphy analysis algorithms fail to differentiate rhythmic movements from other movements, and manual annotation of 2D video is time consuming. We aimed to develop a sensitive, reliable method to detect and quantify rhythmic movements using marker-free, automatic 3D video analysis. Method: Patients with rhythmic movement disorder (n = 6, 4 male) aged 5 to 14 years (M: 9.0 years, SD: 4.2 years) spent three nights in the sleep laboratory as part of a feasibility study. 2D and 3D video data recorded during the adaptation and baseline nights were analyzed. One ceiling-mounted camera captured 3D depth images, while another recorded 2D video. We developed algorithms to analyze the characteristics of rhythmic movements and built a classifier to distinguish between rhythmic and non-rhythmic movements based on 3D video data alone. Data from the automated 3D analysis were compared to manual 2D video annotations to assess algorithm performance. Novel indices were developed, specifically the rhythmic movement index, frequency index, and duration index, to better characterize the severity of rhythmic movement disorder in children. Result: Automatic 3D video analysis demonstrated high agreement with the manual approach, indicated by a Cohen's kappa >0.9 and an F1-score >0.9. We also demonstrated how rhythmic movement assessment can be improved using the newly introduced indices, illustrated with plots for ease of visualization. Conclusion: 3D video technology is widely available and can be readily integrated into sleep laboratory settings.
Our automatic 3D video analysis algorithm yields reliable quantitative information about rhythmic movements, reducing the burden of manual scoring. Furthermore, we propose novel rhythmic movement disorder severity indices that offer a means to standardize measurement of this disorder in both clinical and research practice. The significance of the results is limited by the nature of a feasibility study and its small sample size. A larger follow-up study is needed to confirm the presented results.
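The core of distinguishing rhythmic from non-rhythmic movements is a periodicity test. The exact classifier in the study is not specified, so the following is a minimal sketch under stated assumptions: a movement segment is called rhythmic when the dominant frequency of its motion signal falls in a band typical of rhythmic movement disorder (the 0.5–2 Hz band and both function names are assumptions for illustration, not the authors' definitions).

```python
import numpy as np

def dominant_frequency(segment, fps):
    """Dominant oscillation frequency (Hz) of a 1-D motion segment."""
    seg = np.asarray(segment, dtype=float)
    seg = seg - seg.mean()                 # remove DC offset
    spectrum = np.abs(np.fft.rfft(seg))    # magnitude spectrum
    freqs = np.fft.rfftfreq(seg.size, d=1.0 / fps)
    spectrum[0] = 0.0                      # guard against any residual DC
    return freqs[np.argmax(spectrum)]

def is_rhythmic(segment, fps, band=(0.5, 2.0)):
    """Classify a movement segment as rhythmic if its dominant
    frequency lies inside the assumed rhythmic-movement band."""
    f = dominant_frequency(segment, fps)
    return band[0] <= f <= band[1]
```

Per-episode decisions of this kind could then feed severity summaries such as the proposed rhythmic movement, frequency, and duration indices, e.g., aggregating the count, dominant frequency, and total duration of rhythmic episodes over the night.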