Impedance cardiography (ICG) is a non-invasive method for evaluating several cardiodynamic parameters by measuring the cardiac-synchronous changes in the dynamic transthoracic electrical impedance. ICG allows us to identify and quantify conductivity changes inside the thorax by measuring the impedance on the thorax during a cardiac cycle. Pathologic changes in the aorta, such as aortic dissection, alter the aortic shape as well as the blood flow and, consequently, the impedance cardiogram. This distorts the evaluated cardiodynamic parameters, but it could also open the possibility of identifying aortic pathology. A 3D numerical simulation model is used to compute the impedance changes on the thorax surface in the case of type B aortic dissection. Using this simulation model, a sensitivity analysis is applied to investigate the suitability of different electrode configurations across several patient-specific cases. Results show that the marked pathological changes in the aorta caused by aortic dissection alter the impedance cardiogram significantly.
Aortic dissection is a cardiovascular disease with a disconcertingly high mortality. For diagnosis, medical imaging techniques such as computed tomography, magnetic resonance tomography, or ultrasound certainly do the job, but they also have their shortcomings. Impedance cardiography is a standard method for monitoring a patient's heart function and circulatory system by injecting electric currents and measuring voltage drops between electrode pairs attached to the human body. If such measurements could distinguish healthy from dissected aortas, one could improve clinical procedures. Experiments are quite difficult, and thus we investigate the feasibility with finite element simulations beforehand. These simulations involve uncertain input parameters, e.g., the electrical conductivity of blood. Inferring the state of the aorta from impedance measurements defines an inverse problem in which forward uncertainty propagation through the simulation with vanilla Monte Carlo demands a prohibitively large computational effort. To overcome this limitation, we combine two simulations: a high-fidelity one and a low-fidelity one, with correspondingly high and low computational costs. We use the inexpensive low-fidelity simulation to learn about the expensive high-fidelity simulation. This boils down to a regression problem, and it reduces the total computational cost after all.
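The idea of learning the expensive high-fidelity simulation from the cheap low-fidelity one can be sketched numerically. The following is a minimal illustration, not the paper's actual models: `f_lo` and `f_hi` are hypothetical stand-ins for the two simulations, and a simple linear regression serves as the low-to-high-fidelity map.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for the two simulations: a scalar output
# (think of a measured impedance) as a function of an uncertain input
# (think of the electrical conductivity of blood). Illustrative only.
def f_lo(x):
    return np.sin(x) + 0.30 * x

def f_hi(x):
    return np.sin(x) + 0.35 * x + 0.1 * np.cos(3 * x)

# A few expensive paired evaluations: learn a linear map from the
# low-fidelity output to the high-fidelity output.
x_train = np.linspace(0.0, 3.0, 5)
A = np.column_stack([f_lo(x_train), np.ones_like(x_train)])
coef, *_ = np.linalg.lstsq(A, f_hi(x_train), rcond=None)

# Forward uncertainty propagation: many cheap low-fidelity samples,
# corrected by the regression, instead of many expensive runs.
x_mc = rng.uniform(0.0, 3.0, 10_000)
y_mf = coef[0] * f_lo(x_mc) + coef[1]
print("multi-fidelity mean estimate:", y_mf.mean())
```

In this toy setting the corrected low-fidelity samples approximate the high-fidelity output statistics at a fraction of the cost; the paper replaces the linear map by a Gaussian-process regression.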
In 2000, Kennedy and O’Hagan proposed a model for uncertainty quantification that combines data of several levels of sophistication, fidelity, quality, or accuracy, e.g., a coarse and a fine mesh in finite element simulations. They assumed each level to be describable by a Gaussian process and used low-fidelity simulations to improve inference on costly high-fidelity simulations. Departing from there, we move away from the common non-Bayesian practice of optimizing the parameters and marginalize them instead. We thus avoid the awkward logical dilemma of having to choose parameters while neglecting the uncertainty of that choice. We propagate the parameter uncertainties by averaging the predictions and the prediction uncertainties over all possible parameter values. This can be done analytically for all but the nonlinear or inseparable kernel-function parameters. What remains is a low-dimensional numerical integral, feasible for suitable choices of kernels, thus allowing for a fully Bayesian treatment. By quantifying the uncertainties of the parameters themselves, we show that “learning” or optimizing those parameters has little meaning when data are scarce, which justifies our mathematical efforts. The recent hype about machine learning has long spilled over into computational engineering but fails to acknowledge that machine learning is a big-data problem, whereas in computational engineering we usually face a small-data problem. We devise the fully Bayesian uncertainty quantification method in a notation following the tradition of E. T. Jaynes and find that generalization to an arbitrary number of levels of fidelity, as well as parallelization, becomes rather easy. We scrutinize the method with mock data and demonstrate its advantages in its natural application, where high-fidelity data are scarce but low-fidelity data are not. We then apply the method to quantify the uncertainties in finite element simulations of impedance cardiography of aortic dissection.
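The contrast between optimizing and marginalizing kernel parameters can be illustrated with a small sketch. This is not the paper's implementation: it uses a single-level Gaussian process with one hyperparameter (the length scale of an RBF kernel) and averages the predictions over a grid of length scales weighted by the marginal likelihood, i.e., it evaluates the "low-dimensional numerical integral" by simple quadrature. All data values are made up.

```python
import numpy as np

def rbf(xa, xb, ell):
    # Squared-exponential kernel with length scale ell.
    d = xa[:, None] - xb[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def gp_predict(x, y, xs, ell, noise=1e-4):
    # GP posterior mean at xs and log marginal likelihood for this ell.
    K = rbf(x, x, ell) + noise * np.eye(len(x))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = rbf(xs, x, ell) @ alpha
    logml = (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
             - 0.5 * len(x) * np.log(2 * np.pi))
    return mean, logml

x = np.array([0.0, 1.0, 2.5, 4.0])   # very little data
y = np.sin(x)                        # mock observations
xs = np.linspace(0.0, 4.0, 9)        # prediction grid

# Marginalize the length scale numerically instead of optimizing it:
# with so few points, the likelihood is broad and no single "best"
# length scale is meaningful.
ells = np.linspace(0.3, 3.0, 30)
means, logmls = zip(*(gp_predict(x, y, xs, ell) for ell in ells))
logmls = np.asarray(logmls)
w = np.exp(logmls - logmls.max())
w /= w.sum()                         # posterior weights under a flat prior
pred = w @ np.array(means)           # marginalized prediction
```

In the fully Bayesian treatment, `pred` averages over all plausible length scales rather than committing to one; the paper does the analogous integral over the remaining kernel parameters after marginalizing everything else analytically.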
Aortic dissection is a cardiovascular disease that frequently requires immediate surgical treatment and, thus, a fast diagnosis beforehand. While traditional medical imaging techniques such as computed tomography, magnetic resonance tomography, or echocardiography certainly do the job, impedance cardiography is a clinical standard tool as well; it promises to allow earlier diagnoses and to detect patients who would otherwise go under the radar for too long.