Brain age prediction based on imaging data and machine learning (ML) methods has great potential to provide insights into the development of cognition and mental disorders. Though different ML models have been proposed, a systematic comparison of ML models in combination with imaging features derived from different modalities is still needed. In this study, we evaluate the prediction performance of 36 combinations of imaging features and ML models, including deep learning. We utilize single and multimodal brain imaging data, including MRI, DTI, and rs‐fMRI, from a large data set with 839 subjects. Our study is a follow‐up to the initial work (Liang et al., 2019, Human Brain Mapping) to investigate different analytic strategies to combine data from MRI, DTI, and rs‐fMRI with the goal of improving brain age prediction accuracy. Additionally, the traditional approach to predicting the brain age gap has been shown to have a systematic bias. The potential nonlinear relationship between the brain age gap and chronological age has not been thoroughly tested. Here we propose a new method to correct the systematic bias of the brain age gap by taking gender, chronological age, and their interactions into consideration. As the true brain age is unknown and may deviate from chronological age, we further examine whether various levels of behavioral performance across subjects predict their brain age estimated from neuroimaging data. This is an important step to quantify the practical implication of brain age prediction. Our findings are helpful to advance the practice of optimizing different analytic methodologies in brain age prediction.
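The bias-correction idea described above can be illustrated with a minimal simulation. This is a sketch under stated assumptions, not the authors' exact procedure: the data are synthetic, the sex coding is hypothetical, and ordinary linear regression stands in for whatever model the paper actually fits.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 839
age = rng.uniform(20, 80, n)
sex = rng.integers(0, 2, n)  # hypothetical coding: 0 = female, 1 = male

# simulate a brain-age model with a systematic, age-dependent bias
predicted_age = 0.7 * age + 15 + rng.normal(0, 5, n)
gap = predicted_age - age  # raw brain age gap, correlated with age

# correct the gap by regressing out age, sex, and their interaction
X = np.column_stack([age, sex, age * sex])
model = LinearRegression().fit(X, gap)
corrected_gap = gap - model.predict(X)

# residuals of a linear regression are orthogonal to the predictors,
# so the corrected gap is (numerically) uncorrelated with age
print(abs(np.corrcoef(age, corrected_gap)[0, 1]))
```

After correction, any remaining variation in the gap is free of the linear age and sex trends, which is the property the abstract's proposed method targets.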
Over the past decade, there has been an abundance of research on the difference between age and age predicted using brain features, which is commonly referred to as the “brain age gap.” Researchers have identified that the brain age gap, as a linear transformation of an out‐of‐sample residual, is dependent on age. As such, any group differences in the brain age gap could simply be due to group differences in age. To mitigate the brain age gap's dependence on age, it has been proposed that age be regressed out of the brain age gap. If this modified brain age gap is treated as a corrected deviation from age, model accuracy statistics such as R2 will be artificially inflated, to the extent that it is highly improbable that an R2 value below .85 will be obtained no matter the true model accuracy. Given the limitations of proposed brain age analyses, further theoretical work is warranted to determine the best way to quantify deviation from normality.
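The inflation the abstract warns about can be reproduced in a few lines. This sketch uses entirely synthetic data and a deliberately useless "brain age" model, an assumption for illustration: after age is regressed out of the gap and the result is treated as a corrected prediction, R2 looks excellent even though the model carries no information about age.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(1)
n = 1000
age = rng.uniform(20, 80, n)
# a useless "brain age" model: predictions are independent of age
predicted = rng.normal(50, 5, n)
gap = predicted - age

# regress age out of the gap, then rebuild a "corrected" predicted age
A = age.reshape(-1, 1)
resid = gap - LinearRegression().fit(A, gap).predict(A)
corrected_predicted = age + resid

# the uncorrected model has no accuracy, yet the "corrected" R2 exceeds .85
print(r2_score(age, predicted), r2_score(age, corrected_predicted))
```

Because `corrected_predicted` is literally age plus a small residual, its R2 against age is high by construction, which is why the abstract argues such statistics should not be read as model accuracy.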
Background: Diet plays an important role in chronic disease, and the use of dietary pattern analysis has grown rapidly as a way of deconstructing the complexity of nutritional intake and its relation to health. Pattern analysis methods, such as principal component analysis (PCA), have been used to investigate various dimensions of diet. Existing analytic methods, however, do not fully utilize the predictive potential of dietary assessment data. In particular, these methods are often suboptimal at predicting clinically important variables.
Methods: We propose a new dietary pattern analysis method using the LASSO (Least Absolute Shrinkage and Selection Operator) model to improve the prediction of disease-related risk factors. Despite the potential advantages of LASSO, this is the first time that the model has been adapted for dietary pattern analysis; a systematic evaluation of LASSO as applied to dietary data and health outcomes is therefore novel. Using Food Frequency Questionnaire data from NHANES 2005–2006, we apply PCA and LASSO to identify dietary patterns related to cardiovascular disease risk factors in healthy US adults (n = 2609) after controlling for confounding variables (e.g., age and BMI). Both analyses account for the sampling weights. Model performance in terms of prediction accuracy is evaluated using an independent test set.
Results: PCA yields 10 principal components (PCs) that together account for 65% of the variation in the data set and represent distinct dietary patterns. These PCs are then used as predictors in a regression model to predict cardiovascular disease risk factors.
We find that LASSO better predicts levels of triglycerides, LDL cholesterol, HDL cholesterol, and total cholesterol (adjusted R2 = 0.861, 0.899, 0.890, and 0.935, respectively) than does the traditional linear-regression-based dietary pattern analysis method (adjusted R2 = 0.163, 0.005, 0.235, and 0.024, respectively) when the latter is applied to components derived from PCA.
Conclusions: The proposed method is an appropriate and promising statistical means of deriving dietary patterns predictive of cardiovascular disease risk. Future studies, involving different diseases and risk factors, will be necessary before LASSO's broader usefulness in nutritional epidemiology can be established.
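The PCA-then-regression versus LASSO comparison can be sketched on synthetic data. Everything here is an assumption for illustration: the food-frequency matrix is random, the outcome depends sparsely on a handful of items (the setting where LASSO is expected to shine), and the dimensions are merely borrowed from the abstract. None of this reproduces the NHANES analysis or its sampling weights.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LassoCV, LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
n, p = 2609, 60  # subjects x food-frequency items (sizes borrowed from the abstract)
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = [2.0, -1.5, 1.0, 0.8, -0.6]  # outcome driven by a few items
y = X @ beta + rng.normal(0, 1, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_tr)

# PCA-based pattern analysis: 10 components, then linear regression
pca = PCA(n_components=10).fit(scaler.transform(X_tr))
lr = LinearRegression().fit(pca.transform(scaler.transform(X_tr)), y_tr)
r2_pca = lr.score(pca.transform(scaler.transform(X_te)), y_te)

# LASSO fit directly on all items, with a cross-validated penalty
lasso = LassoCV(cv=5).fit(scaler.transform(X_tr), y_tr)
r2_lasso = lasso.score(scaler.transform(X_te), y_te)

print(r2_pca, r2_lasso)
```

Because the top PCs of the predictor matrix need not align with the directions that drive the outcome, the PCA pipeline discards most of the signal here, while LASSO selects the relevant items directly; this mirrors the pattern of the adjusted R2 values reported above.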
Background Behavioral obesity treatment (BOT) is a gold standard approach to weight loss and reduces the risk of cardiovascular disease. However, frequent lapses from the recommended diet stymie weight loss and prevent individuals from actualizing the health benefits of BOT. There is a need for innovative treatment solutions to improve adherence to the prescribed diet in BOT. Objective The aim of this study is to optimize a smartphone-based just-in-time adaptive intervention (JITAI) that uses daily surveys to assess triggers for dietary lapses and deliver interventions when the risk of lapse is high. A microrandomized trial design will evaluate the efficacy of any intervention (ie, theory-driven or a generic alert to risk) on the proximal outcome of lapse during BOT, compare the effects of theory-driven interventions with generic risk alerts on the proximal outcome of lapse, and examine contextual moderators of interventions. Methods Adults with overweight or obesity and cardiovascular disease risk (n=159) will participate in a 6-month web-based BOT while using the JITAI to prevent dietary lapses. Each time the JITAI detects elevated lapse risk, the participant will be randomized to no intervention, a generic risk alert, or 1 of 4 theory-driven interventions (ie, enhanced education, building self-efficacy, fostering motivation, and improving self-regulation). The primary outcome will be the occurrence of lapse in the 2.5 hours following randomization. Contextual moderators of intervention efficacy will also be explored (eg, location and time of day). The data will inform an optimized JITAI that selects the theory-driven approach most likely to prevent lapses in a given moment. Results The recruitment for the microrandomized trial began on April 19, 2021, and is ongoing. Conclusions This study will optimize a JITAI for dietary lapses so that it empirically tailors the provision of evidence-based intervention to the individual and context.
The finalized JITAI will be evaluated for efficacy in a future randomized controlled trial of distal health outcomes (eg, weight loss). Trial Registration ClinicalTrials.gov NCT04784585; http://clinicaltrials.gov/ct2/show/NCT04784585 International Registered Report Identifier (IRRID) DERR1-10.2196/33568
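The microrandomization scheme described in the protocol can be sketched in a few lines. The arm labels, the uniform assignment probabilities, and the function names below are assumptions for illustration; the registered protocol may weight arms differently.

```python
import random

# hypothetical labels for the six arms named in the protocol above
ARMS = [
    "no_intervention",
    "generic_risk_alert",
    "enhanced_education",
    "building_self_efficacy",
    "fostering_motivation",
    "improving_self_regulation",
]

def randomize_decision_point(rng: random.Random) -> str:
    """At each moment of elevated lapse risk, assign one arm uniformly at random."""
    return rng.choice(ARMS)

rng = random.Random(0)
# simulate many decision points across the trial
assignments = [randomize_decision_point(rng) for _ in range(6000)]
counts = {arm: assignments.count(arm) for arm in ARMS}
print(counts)  # each arm is used roughly equally often
```

Because each participant is re-randomized at every elevated-risk moment rather than once at enrollment, the design yields many within-person contrasts between arms, which is what lets the trial estimate proximal (2.5-hour) intervention effects.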