There is no real need to belabor the potential advantages of magnetic resonance imaging (MRI) as an intraoperative diagnosis and therapy system, mainly its excellent soft-tissue contrast, nonionizing radiation, and flow and molecular information, particularly for neurological applications and oncological therapies. However, difficult patient access in conventional horizontal-field superconducting magnets, very high investment and operating costs, and the need for special nonferromagnetic therapy tools have prevented the widespread use of MRI as an imaging and guidance tool for therapy. For more than 20 years, the interventional use of MRI systems has followed the strategy of taking standard diagnostic systems and adding more or less complicated and expensive components (e.g., MRI-compatible robotic systems, specially shielded in-room monitors, and dedicated tools and devices made from low-susceptibility materials) to overcome the difficulties in the therapy process. We propose to rethink this approach using an in-room portable ultrasound (US) system that can be operated safely as close as 1 m from the opening of a 3 T imaging system. The live US images are tracked with an optical inside-out approach, a camera attached to the US probe in combination with optical reference markers, allowing direct fusion with the MRI images inside the MRI suite. This enables a comfortable US-guided intervention with excellent patient access directly on the MRI patient bed. Combined with an entirely mechanical, MRI-compatible holding arm with 7 degrees of freedom, this test environment demonstrates a different, cost-efficient and effective way to combine the advantages of MRI and US while largely avoiding the drawbacks of current interventional MRI concepts.
Performing minimally invasive interventions under real-time image guidance is problematic in a closed-bore magnetic resonance imaging (MRI) scanner. To improve the usability of MRI-guided interventions, robotic systems can provide additional assistance. However, integrating such devices into the clinical workflow raises many technical challenges in increasing the precision of the procedure while ensuring overall safety. In this work, an MR-compatible, compact, ultra-light and remotely controllable micropositioning system called μRIGS is presented. The instrument positioning unit operates with 5 degrees of freedom (DoF) within a working volume of 2100 cm³ and provides an instrument feed of 120 mm. The kinematics are actuated by a combination of non-metallic Bowden cables and electric stepper motors placed at a safe distance inside the scanner room, and are controlled from the control room via a custom-fitted GUI. The positioning reproducibility of the respective DoF achieves a mean deviation of 0.12°. Furthermore, a feed force of 14 N is available for puncturing various soft tissues.
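The abstract above does not specify the drive parameters of the μRIGS stepper actuation; as an illustration only, the following sketch shows how a target joint angle is typically converted into stepper motor steps. The step count, microstepping factor and cable reduction ratio are hypothetical placeholders, not values from the paper.

```python
def angle_to_steps(target_deg, steps_per_rev=200, microstepping=16, gear_ratio=5.0):
    """Convert a desired output angle into stepper motor steps.

    Assumed (hypothetical) drive: 200 full steps/rev, 16x microstepping,
    5:1 Bowden-cable reduction, giving 200*16*5/360 ≈ 44.4 steps per
    output degree, i.e. a theoretical resolution of ~0.0225°.
    """
    steps_per_deg = steps_per_rev * microstepping * gear_ratio / 360.0
    return round(target_deg * steps_per_deg)

# rotate one DoF by 90° with the assumed drive train
print(angle_to_steps(90.0))  # → 4000
```

A reported reproducibility of 0.12° is thus well above the theoretical step resolution of such a drive, which is plausible given cable friction and backlash in a Bowden transmission.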
During a magnetic resonance imaging (MRI) exam, a respiratory signal can be required for different purposes, e.g., patient monitoring, motion compensation, or research studies such as functional MRI. In addition, respiratory information can serve as biofeedback for the patient to control breath holds or shallow breathing. To reduce patient preparation time and avoid distortions of the MR imaging system, we propose a contactless approach for gathering information about respiratory activity. An experimental setup based on a commercially available laser range sensor was used to detect respiration-induced motion of the chest or abdomen. The setup was tested with a motion phantom and with several human subjects in an MRI scanner, using a nasal airflow sensor as a reference. For both the phantom and the human subjects, the motion frequency was measured precisely. These results show that a low-cost, contactless, laser-based approach can be used to obtain information about respiratory motion during an MRI exam.
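The abstract does not state how the motion frequency was extracted from the range signal; one common approach, sketched below under that assumption, is a spectral peak search over a plausible respiratory band. The sampling rate, amplitude and band limits are illustrative, not taken from the study.

```python
import numpy as np

def respiratory_frequency(distance_mm, fs):
    """Estimate the dominant breathing frequency (Hz) of a laser
    range signal sampled at fs Hz via an FFT peak search."""
    x = np.asarray(distance_mm, dtype=float)
    x = x - x.mean()                          # remove the DC offset (mean distance)
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    # restrict the search to a plausible respiratory band (0.1–1.0 Hz, 6–60 breaths/min)
    band = (freqs >= 0.1) & (freqs <= 1.0)
    return freqs[band][np.argmax(spectrum[band])]

# synthetic chest motion: 5 mm amplitude at 0.25 Hz (15 breaths/min)
fs = 50.0
t = np.arange(0, 60, 1.0 / fs)
signal = 500.0 + 5.0 * np.sin(2 * np.pi * 0.25 * t)
print(respiratory_frequency(signal, fs))  # → 0.25
```

With a 60 s window the FFT bin width is 1/60 Hz, so the frequency estimate is accurate to about 0.017 Hz (1 breath/min), which is sufficient for rate monitoring.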
MRI-guided interventions (iMRI, e.g., prostate biopsy) are usually performed with a free-hand instrument targeting approach to guide and feed the instrument (e.g., a biopsy needle) along the desired trajectory. However, this technique requires many iterative movements of the interventionist or of the patient (with the MRI table) into and out of the MRI tunnel, which is cumbersome, time-consuming and expensive. To overcome these downsides, interventional MRI procedures can be facilitated with remotely controllable assistance systems for instrument alignment. Such systems require accurate registration and tracking of the position and orientation of the instrument. Passive fiducial marker frames (e.g., an additively manufactured Z-frame marker) can provide full information about a device's orientation within the image. In this paper, we present an automated alignment detection algorithm that tracks a Z-marker in system-independent screen-captured images. We evaluated the precision of the detection algorithm by comparing its results on schematic gold-standard images in different marker orientations with MR images in the same orientations. Our combined setup, consisting of a precise alignment system, an additively manufactured Z-frame marker and the related detection algorithm, offers fast, simple, independent and accurate automated instrument targeting for iMRI. In future work, we plan to conduct phantom targeting tests in combination with robotic alignment systems.
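The geometric principle behind a Z-frame fiducial is that an imaging slice cuts each Z-shaped rod set at three points, and the position of the diagonal rod's intersection relative to the two parallel rods encodes the slice position linearly. The sketch below illustrates this principle for a single Z in an idealized geometry; the frame dimensions and the specific detection pipeline of the paper are not given in the abstract, so this is a generic illustration rather than the authors' algorithm.

```python
def z_frame_slice_height(x_left, x_diag, x_right, frame_height):
    """Recover the slice position along one Z-fiducial.

    x_left / x_right: in-plane coordinates where the two parallel rods
    intersect the slice; x_diag: intersection of the diagonal rod.
    Assumed geometry: the diagonal runs from the left rod at z = 0 to
    the right rod at z = frame_height, so its in-plane position
    interpolates the slice height linearly.
    """
    frac = (x_diag - x_left) / (x_right - x_left)
    return frac * frame_height

# example: 60 mm wide, 60 mm tall Z; diagonal seen one third of the way across
print(z_frame_slice_height(0.0, 20.0, 60.0, 60.0))  # → 20.0
```

Repeating this measurement for three Z-fiducials on different faces of the frame yields enough constraints to solve for the full 6-DoF pose of the slice relative to the marker.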