[18F]FDOPA PET imaging has shown that dopaminergic function, indexed as Kicer, differs between antipsychotic treatment responders and non-responders. However, the theragnostic potential of this biomarker to identify non-responders has yet to be evaluated. In view of this, we aimed to evaluate this as a theragnostic test using linear and non-linear machine-learning analyses (i.e., Bernoulli naïve Bayes, support vector machine, random forest and Gaussian process classifiers) and to develop and evaluate a simplified approach, the standardised uptake value ratio (SUVRc). Both [18F]FDOPA PET approaches had good test-retest reproducibility across striatal regions (Kicer ICC: 0.68–0.94, SUVRc ICC: 0.76–0.91). Both our linear and non-linear classification models showed good predictive power to distinguish responders from non-responders (area under the receiver operating characteristic curve for the region-of-interest approach: Kicer = 0.80, SUVRc = 0.79; for the voxel-wise approach using a linear support vector machine: 0.88) and similar sensitivity for identifying treatment non-responders at 100% specificity (Kicer: ~50%, SUVRc: 40–60%). Although the findings were replicated in two independent datasets, given the total sample size (n = 84) and single setting, they warrant testing in other samples and settings. Preliminary economic analysis of using [18F]FDOPA PET to fast-track treatment-resistant patients with schizophrenia to clozapine indicated a potential healthcare cost saving of ~£3400 (equivalent to ~$4232 USD) per patient. These findings indicate that [18F]FDOPA PET dopamine imaging has potential as a biomarker to guide treatment choice.
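To make the reported metrics concrete, the snippet below is a minimal sketch (using scikit-learn on synthetic scores, not the study's data or pipeline) of how ROC AUC and sensitivity at 100% specificity can be read off classifier outputs; all values are illustrative.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
# Hypothetical labels and scores: 1 = non-responder, 0 = responder
y_true = np.concatenate([np.ones(20), np.zeros(40)])
y_score = np.concatenate([rng.normal(1.0, 0.5, 20), rng.normal(0.0, 0.5, 40)])

auc = roc_auc_score(y_true, y_score)
fpr, tpr, thresholds = roc_curve(y_true, y_score)

# Sensitivity at 100% specificity: highest true-positive rate among
# thresholds that produce zero false positives (FPR == 0)
sens_at_full_spec = tpr[fpr == 0].max()
print(f"AUC = {auc:.2f}, sensitivity at 100% specificity = {sens_at_full_spec:.2f}")
```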
We introduce GPflux, a Python library for Bayesian deep learning with a strong emphasis on deep Gaussian processes (DGPs). Implementing DGPs is a challenging endeavour due to the various mathematical subtleties that arise when dealing with multivariate Gaussian distributions and the complex bookkeeping of indices. To date, there are no actively maintained, open-source and extensible libraries available that support research activities in this area. GPflux aims to fill this gap by providing a library with state-of-the-art DGP algorithms, as well as building blocks for implementing novel Bayesian and GP-based hierarchical models and inference schemes. GPflux is compatible with, and built on top of, the Keras deep learning ecosystem. This enables practitioners to leverage tools from the deep learning community for building and training customised Bayesian models, and to create hierarchical models that consist of Bayesian and standard neural network layers within a single coherent framework. GPflux relies on GPflow for most of its GP objects and operations, which makes it an efficient, modular and extensible library with a lean codebase.
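As a flavour of the intended workflow, the following is a minimal sketch of building and training a two-layer DGP with GPflux's Keras-style API, following the layer and model names in the GPflux documentation; the data and hyperparameter values are illustrative, not taken from the paper.

```python
import numpy as np
import tensorflow as tf
import gpflow
import gpflux

# Toy 1-D regression data
X = np.linspace(0, 1, 100).reshape(-1, 1)
Y = np.sin(10 * X) + 0.1 * np.random.randn(100, 1)

Z = np.linspace(0, 1, 20).reshape(-1, 1)  # inducing inputs

# Two GP layers, each with its own kernel and inducing variables
gp_layer1 = gpflux.layers.GPLayer(
    gpflow.kernels.SquaredExponential(),
    gpflow.inducing_variables.InducingPoints(Z.copy()),
    num_data=X.shape[0],
    num_latent_gps=1,
)
gp_layer2 = gpflux.layers.GPLayer(
    gpflow.kernels.SquaredExponential(),
    gpflow.inducing_variables.InducingPoints(Z.copy()),
    num_data=X.shape[0],
    num_latent_gps=1,
    mean_function=gpflow.mean_functions.Zero(),
)

# The likelihood layer wraps a GPflow likelihood; DeepGP composes the layers
likelihood_layer = gpflux.layers.LikelihoodLayer(gpflow.likelihoods.Gaussian(0.1))
dgp = gpflux.models.DeepGP([gp_layer1, gp_layer2], likelihood_layer)

# Keras integration: compile and fit like any other Keras model
model = dgp.as_training_model()
model.compile(tf.optimizers.Adam(0.01))
model.fit({"inputs": X, "targets": Y}, epochs=200, verbose=0)
```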
The natural gradient method has been used effectively in conjugate Gaussian process models, but the non-conjugate case has been largely unexplored. We examine how natural gradients can be used in non-conjugate stochastic settings, together with hyperparameter learning. We conclude that the natural gradient can significantly improve performance in terms of wall-clock time. For ill-conditioned posteriors the benefit of the natural gradient method is especially pronounced, and we demonstrate a practical setting where ordinary gradients are unusable. We show how natural gradients can be computed efficiently and automatically in any parameterization, using automatic differentiation. Our code is integrated into the GPflow package.
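As an illustration of the resulting workflow in GPflow, the sketch below (assuming GPflow 2's SVGP model and NaturalGradient optimizer, on synthetic data) alternates natural-gradient steps on the variational parameters with Adam steps on the hyperparameters in a non-conjugate (Bernoulli) model; the specific settings are illustrative rather than those used in the paper.

```python
import numpy as np
import tensorflow as tf
import gpflow

# Toy non-conjugate problem: binary classification with a Bernoulli likelihood
X = np.random.rand(200, 1)
Y = (np.sin(12 * X) > 0).astype(float)

model = gpflow.models.SVGP(
    kernel=gpflow.kernels.SquaredExponential(),
    likelihood=gpflow.likelihoods.Bernoulli(),
    inducing_variable=np.linspace(0, 1, 30).reshape(-1, 1),
)

# The variational parameters are updated by natural gradients, so exclude
# them from the ordinary (Adam) optimiser.
gpflow.set_trainable(model.q_mu, False)
gpflow.set_trainable(model.q_sqrt, False)

natgrad = gpflow.optimizers.NaturalGradient(gamma=0.1)
adam = tf.optimizers.Adam(0.01)
loss = model.training_loss_closure((X, Y))

for _ in range(500):
    # Natural-gradient step on (q_mu, q_sqrt), Adam step on hyperparameters
    natgrad.minimize(loss, var_list=[(model.q_mu, model.q_sqrt)])
    adam.minimize(loss, var_list=model.trainable_variables)
```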