The method provides motion information with sufficiently high spatial and temporal resolution. It thereby enables meaningful analyses in the form of comparisons between amplitudes and phase shifts of signals from different body regions. Combined with the large field-of-view obtained by merging the data of two Kinect cameras, it yields surface representations that may be useful for motion correction and motion modeling.
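The abstract does not specify how amplitudes and phase shifts are compared; a common approach is to evaluate the dominant (respiratory) spectral component of each regional signal. A minimal sketch under that assumption, with the 30 Hz sampling rate taken from the Kinect V2 frame rate and all function names illustrative:

```python
import numpy as np

def amplitude_and_phase_shift(sig_a, sig_b, fs=30.0):
    """Compare two regional motion signals: peak-to-peak amplitudes and
    the phase shift of sig_b relative to sig_a at the dominant
    (respiratory) frequency of sig_a. Assumes equal-length signals
    sampled at fs Hz (illustrative choice, not from the paper)."""
    sig_a = sig_a - np.mean(sig_a)
    sig_b = sig_b - np.mean(sig_b)

    # Peak-to-peak range as a simple amplitude measure.
    amp_a = np.ptp(sig_a)
    amp_b = np.ptp(sig_b)

    # Dominant frequency of signal A from its spectrum (skip the DC bin).
    freqs = np.fft.rfftfreq(len(sig_a), d=1.0 / fs)
    spec_a = np.fft.rfft(sig_a)
    spec_b = np.fft.rfft(sig_b)
    k = np.argmax(np.abs(spec_a[1:])) + 1

    # Phase shift (radians) between the two signals at that frequency,
    # wrapped to [-pi, pi).
    phase_shift = np.angle(spec_b[k]) - np.angle(spec_a[k])
    phase_shift = (phase_shift + np.pi) % (2 * np.pi) - np.pi

    return amp_a, amp_b, freqs[k], phase_shift
```

For example, two sinusoids at 0.25 Hz offset by a quarter period would yield a phase shift of about pi/2, indicating delayed motion in one region relative to the other.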
PET attenuation correction for flexible MRI radio-frequency surface coils in hybrid PET/MRI is still a challenging task, as the position and shape of these coils vary considerably between patients. The purpose of this feasibility study is to develop a novel method for incorporating attenuation information about flexible surface coils into PET reconstruction using the Microsoft Kinect V2 depth camera. The depth information is used to determine a dense point cloud of the coil's surface, representing the shape of the coil. From a CT template, acquired once in advance, surface information of the coil is likewise extracted and converted into a point cloud. The two point clouds are then registered using a combination of an iterative closest point (ICP) method and a partially rigid registration step. Using the transformation derived from the point clouds, the CT template is warped and thereby adapted to the PET/MRI scan setup. The transformed CT template is then converted from Hounsfield units into linear attenuation coefficients, and the resulting fitted attenuation map is integrated into the patient-specific MRI DIXON-based attenuation map of the actual PET/MRI scan. A reconstruction of phantom PET data acquired with the coil present in the field-of-view (FoV), but without the corresponding coil attenuation map, shows large artifacts in regions close to the coil; the overall count loss is around 13% compared to a PET scan without the coil in the FoV. A reconstruction using the new μ-map strongly reduces these artifacts and increases overall PET intensities, with a remaining relative difference of about 1% from a PET scan without the coil in the FoV.
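The abstract names ICP as the rigid alignment step but not a particular implementation, and the subsequent partially rigid refinement is not detailed. A minimal sketch of the rigid ICP stage, assuming Open3D as the registration library and metre-scale coordinates (both assumptions, not from the paper):

```python
import numpy as np
import open3d as o3d

def register_coil_surfaces(template_pts, depth_pts, max_corr_dist=0.01):
    """Rigid ICP alignment of the CT-template coil surface (source, Nx3
    array) to the Kinect depth point cloud (target, Mx3 array).
    max_corr_dist is the ICP correspondence radius (1 cm here, an
    arbitrary illustrative choice)."""
    source = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(template_pts))
    target = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(depth_pts))

    # Point-to-plane ICP tends to converge faster on smooth surfaces;
    # it requires normals on the target cloud.
    target.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=0.02, max_nn=30))

    result = o3d.pipelines.registration.registration_icp(
        source, target, max_corr_dist, np.eye(4),
        o3d.pipelines.registration.TransformationEstimationPointToPlane())

    # 4x4 homogeneous transform that warps the CT template into the
    # PET/MRI scan geometry; the paper's partially rigid refinement
    # and the HU-to-mu conversion are not reproduced here.
    return result.transformation
```

Point-to-plane estimation is one reasonable design choice for smooth coil surfaces; the original work may well use point-to-point ICP or different parameters.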
Physiological motion combined with long scan times in PET leads to image degradation and quantification errors. Correction approaches usually require 1-D surrogate signals, which can be obtained with hardware-based or data-driven methods. Most of the latter are optimized for, or limited to, capturing internal motion along the superior-inferior (S-I) direction. In this work we present methods for additionally extracting anterior-posterior (A-P) motion from PET data and propose a set of novel weighting mechanisms that emphasize certain lines of response (LORs) for increased sensitivity and a better signal-to-noise ratio (SNR). The proper functioning of the methods was verified in a phantom experiment. Furthermore, their application to clinical [18F]FDG PET data of 72 patients revealed that the weighting mechanisms lead to signals with significantly higher spectral respiratory weights, i.e. signals of higher quality. Information about multi-dimensional motion is thus contained in PET data and can be derived with data-driven methods. Motion models and correction techniques such as respiratory gating may benefit from the proposed methods, as they allow the three-dimensional movement of PET-positive structures to be described more precisely.
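The abstract does not spell out the extraction method or the weighting mechanisms. A common data-driven baseline is a per-time-frame centre of mass of the detected events, which a per-event weight can bias toward selected LORs; the sketch below follows that pattern, with the weighting itself left as a caller-supplied stand-in and all names illustrative:

```python
import numpy as np

def motion_signals(event_t, event_y, event_z, frame_dur=0.5, weights=None):
    """Extract 1-D A-P (y) and S-I (z) surrogate motion signals from
    list-mode-like PET events via weighted per-frame centroids.

    event_t          : event time stamps in seconds (1-D array)
    event_y, event_z : A-P / S-I coordinates of each event (e.g. the
                       midpoint of its LOR), in mm
    frame_dur        : time-frame length in seconds (0.5 s is an
                       arbitrary illustrative choice)
    weights          : optional per-event weights emphasizing selected
                       LORs (a stand-in for the paper's mechanisms)
    """
    if weights is None:
        weights = np.ones_like(event_t)

    n_frames = int(np.ceil(event_t.max() / frame_dur))
    frame_idx = np.minimum((event_t / frame_dur).astype(int), n_frames - 1)

    # Weighted per-frame centroids: sum(w * pos) / sum(w) per frame.
    w_sum = np.bincount(frame_idx, weights=weights, minlength=n_frames)
    w_sum = np.maximum(w_sum, 1e-12)  # guard against empty frames
    sig_ap = np.bincount(frame_idx, weights=weights * event_y,
                         minlength=n_frames) / w_sum
    sig_si = np.bincount(frame_idx, weights=weights * event_z,
                         minlength=n_frames) / w_sum
    return sig_ap, sig_si
```

With uniform weights this reduces to the classical centre-of-mass surrogate; the paper's contribution lies in choosing non-uniform weights so that LORs near moving, PET-positive structures dominate the centroid.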
Accurate respiratory signals with high spatial and temporal resolution were successfully obtained with the proposed method. Because it works contact-free and penetrates clothing and blankets, the approach minimizes preparation time and increases patient comfort during the scan.