Ultrasound is an essential imaging modality in clinical screening and diagnosis, reducing morbidity and improving quality of life. Successfully performing ultrasound imaging, however, requires extensive training and expertise, both in navigating a hand-held probe to the correct anatomical location and in subsequently interpreting the acquired image. Computer-generated simulations can offer a safe, flexible, and standardized environment to train such skills. Data-based simulations display interpolated slices from a-priori-acquired real ultrasound volumes, whereas generative simulations aim to reproduce the complex ultrasound interactions with comprehensive, geometric anatomical models, e.g., using ray tracing to mimic acoustic propagation. Although sonographers typically focus on relatively small structures of interest in ultrasound images, the fidelity of the background anatomy may still contribute to the realism of a generated US image, e.g., when imaging a relatively small fetus within a large abdominal background. It was proposed earlier to compose ray-traced images with acquired volumes in a preprocessing step. Despite its simplicity, this precludes any view-dependent artifacts and interactive model changes, such as those induced by animations, which can, for instance, model fetal motion. To fully leverage the flexibility of the model-based generative approach, we propose herein an on-the-fly image fusion of the two techniques, by moving the interpolation stage inside the ray tracer, such that the pre-acquired image data can be referenced in the background, while the acoustic interactions with the model are resolved in the foreground. This allows for animated anatomical models, which we realize at simulation runtime via scene-hierarchy subtree switching between precomputed acceleration-structure graphs. We demonstrate our proposed techniques on ultrasound sequences of fetal and cardiac motion, where only animated models can meet the realism requirements entailed by the temporal domain.
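As a rough illustration of the on-the-fly fusion idea described above, the sketch below (not taken from the paper; the function names `shade_scanline_sample` and `trilinear_sample`, and the placeholder foreground intensity, are hypothetical) shows how, for each sample along a simulated scanline, a ray that does not hit the foreground anatomical model at the queried depth could fall back to trilinear interpolation of the pre-acquired background volume, while foreground hits are left to the model-based acoustic computation.

```python
import numpy as np

def trilinear_sample(volume, p):
    """Trilinearly interpolate a (Z, Y, X) intensity volume at continuous
    voxel coordinates p = (z, y, x), assumed to lie inside the volume."""
    z0, y0, x0 = (min(int(np.floor(v)), n - 2) for v, n in zip(p, volume.shape))
    dz, dy, dx = p[0] - z0, p[1] - y0, p[2] - x0
    c = volume[z0:z0 + 2, y0:y0 + 2, x0:x0 + 2].astype(float)
    c = c[:, :, 0] * (1 - dx) + c[:, :, 1] * dx   # interpolate along x
    c = c[:, 0] * (1 - dy) + c[:, 1] * dy         # then along y
    return c[0] * (1 - dz) + c[1] * dz            # then along z

def shade_scanline_sample(ray_origin, ray_dir, depth,
                          foreground_hit, volume, world_to_voxel):
    """Return the intensity of one sample along a simulated scanline.

    foreground_hit: None, or a (t_hit, model_intensity) pair produced by the
    ray/model intersection (model_intensity stands in for the acoustic result).
    """
    if foreground_hit is not None and foreground_hit[0] <= depth:
        # Foreground: acoustic interaction with the anatomical model.
        return foreground_hit[1]
    # Background: interpolate the pre-acquired volume on the fly.
    p_world = ray_origin + depth * ray_dir
    p_voxel = (world_to_voxel @ np.append(p_world, 1.0))[:3]
    return trilinear_sample(volume, p_voxel)
```

Because the background lookup happens per ray sample rather than in a preprocessing pass, view-dependent effects and runtime model changes (e.g., switching an animated subtree of the scene hierarchy between frames) remain possible, which is the motivation for moving the interpolation stage inside the ray tracer.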