IMPORTANCE Cancer treatment delay has been reported to variably impact cancer-specific survival and coronavirus disease 2019 (COVID-19)-specific mortality during the severe acute respiratory syndrome coronavirus 2 pandemic. During the pandemic, treatment delay is being recommended in a nonquantitative, nonobjective, and nonpersonalized manner, and this approach may be associated with suboptimal outcomes. Quantitative integration of cancer mortality estimates and data on the consequences of treatment delay is needed to aid treatment decisions and improve patient outcomes.

OBJECTIVE To obtain quantitative integration of cancer-specific and COVID-19-specific mortality estimates that can be used to make optimal decisions for individual patients and optimize resource allocation.

DESIGN, SETTING, AND PARTICIPANTS In this decision analytical model, age-specific and stage-specific estimates of overall survival pre-COVID-19 were adjusted by the probability of COVID-19 (individualized by county, treatment-specific variables, hospital exposure frequency, and COVID-19 infectivity estimates), COVID-19 mortality (individualized by age-specific, comorbidity-specific, and treatment-specific variables), and delay of cancer treatment (impact and duration). These model estimates were integrated into a web application (OncCOVID) to calculate estimates of the cumulative overall survival and restricted mean survival time of patients who received immediate vs delayed cancer treatment. Using currently available information about COVID-19, a susceptible-infected-recovered model that accounted for the increased risk among patients at health care treatment centers was developed. This model integrated the data on cancer mortality and the consequences of treatment delay to aid treatment decisions. Age-specific and cancer stage-specific estimates of overall survival pre-COVID-19 were extracted from the Surveillance, Epidemiology, and End Results database for 691 854 individuals with 25 cancer types who received cancer diagnoses in 2005 to 2006. Data from 5 436 896 individuals in the National Cancer Database were used to estimate the independent impact of treatment delay by cancer type and stage. In addition, data from 275 patients in a nested case-control study were used to estimate the COVID-19 mortality rate by age group and number of comorbidities. Data were analyzed from March 17 to May 21, 2020.

EXPOSURES COVID-19 and cancer.

MAIN OUTCOMES AND MEASURES Estimates of restricted mean survival time after the receipt of immediate vs delayed cancer treatment.
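The model described above combines a pre-COVID survival curve with an infection probability, a COVID-19 case-fatality estimate, and a treatment-delay penalty, and then compares restricted mean survival time (RMST) for immediate versus delayed treatment. The Python sketch below illustrates that arithmetic only; the exponential baseline curve, the proportional-hazards delay penalty, and all parameter values are illustrative assumptions, not the OncCOVID implementation.

```python
import numpy as np

def rmst(surv_probs, times, tau):
    """Restricted mean survival time: area under the survival curve up to tau."""
    mask = times <= tau
    return np.trapz(surv_probs[mask], times[mask])

def covid_adjusted_survival(baseline_surv, times, p_infection, covid_case_fatality,
                            delay_months=0.0, hazard_ratio_per_month_delay=1.0):
    """Adjust a pre-COVID survival curve for (a) the chance of acquiring COVID-19
    through treatment-related health care exposure and (b) the survival decrement
    from delaying cancer treatment, modeled here as a proportional-hazards penalty."""
    # Treatment delay: apply a proportional-hazards penalty to the cancer survival curve.
    hr = hazard_ratio_per_month_delay ** delay_months
    delayed_surv = baseline_surv ** hr
    # COVID-19: with probability p_infection the patient is infected, and then dies of
    # COVID-19 with probability covid_case_fatality (assumed independent of cancer death).
    p_covid_death = p_infection * covid_case_fatality
    return delayed_surv * (1.0 - p_covid_death)

# Illustrative 5-year monthly survival curve for a hypothetical patient.
times = np.arange(0, 61)            # months
baseline = np.exp(-0.01 * times)    # hypothetical pre-COVID overall survival

# Immediate treatment: more hospital exposure, hence higher infection probability.
immediate = covid_adjusted_survival(baseline, times, p_infection=0.05,
                                    covid_case_fatality=0.10)
# Delayed treatment: less exposure, but a penalty for each month of delay.
delayed = covid_adjusted_survival(baseline, times, p_infection=0.01,
                                  covid_case_fatality=0.10, delay_months=3,
                                  hazard_ratio_per_month_delay=1.02)

tau = 60  # restrict to 5 years
print(f"RMST immediate: {rmst(immediate, times, tau):.1f} months")
print(f"RMST delayed:   {rmst(delayed, times, tau):.1f} months")
```

Comparing the two printed RMST values is the basic decision quantity the abstract describes: whichever strategy yields the larger restricted mean survival time is favored for that patient under the assumed inputs.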
Stream water quality can be greatly influenced by changes in agricultural practices, but studies of long‐term dynamics are scarce. Here we describe trends over 21 yr (1994–2014) in nutrients and suspended sediments in three streams in a Midwestern US agricultural watershed. During this time, the watershed experienced substantial changes in agricultural practices, most importantly a pronounced shift from conventional to conservation tillage. In the 1990s and early 2000s, NH4, soluble reactive P, and suspended sediment concentrations (standardized for discharge and season) each declined significantly (>4–12% per year) in at least two of the three streams (P < 0.01), whereas NO3 changed relatively little. However, since the early 2000s, declines in NH4 and sediment concentrations have slowed, soluble reactive P concentrations have not declined and may actually have increased, and NO3 concentrations have declined sharply. The more recent lack of decline in soluble reactive P coincides with a plateau in the prevalence of conservation tillage and may be because of increased soil P stratification due to long‐term reduced tillage. The more recent decline in NO3 may be due to improved efficiency of N fertilizer use, increased soil denitrification, and/or declines in atmospheric N deposition. Our study shows that stream concentrations of N, P, and sediment can respond in contrasting ways to changes in agriculture, and that temporal trends can moderate, accelerate, or reverse over decadal timescales. Management strategies must consider contrasting temporal responses of water quality indicators and may need to be adaptively adjusted at scales of years to decades.

Core Ideas
- Stream water quality in a Midwestern watershed responds to agricultural management.
- Conservation tillage and nutrient management appear to be major drivers of change.
- Nitrogen, phosphorus, and sediment show contrasting dynamics over decadal timescales.
- Management plans must anticipate temporally variable trends over multiple decades.
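The trends above are reported as percent change per year in concentrations standardized for discharge and season. As a minimal sketch of how such a figure can be estimated, the Python example below fits a log-linear regression with a discharge term and annual harmonics to a synthetic record; the model form, function name, and data are assumptions for illustration and not the study's actual standardization procedure.

```python
import numpy as np

def percent_per_year_trend(decimal_years, concentration, discharge):
    """Estimate a flow- and season-adjusted concentration trend, expressed as
    percent change per year, via log-linear regression with annual harmonics."""
    t = np.asarray(decimal_years, dtype=float)
    y = np.log(np.asarray(concentration, dtype=float))
    q = np.log(np.asarray(discharge, dtype=float))
    # Design matrix: intercept, linear time trend, log discharge, annual sine/cosine.
    X = np.column_stack([
        np.ones_like(t),
        t,
        q,
        np.sin(2 * np.pi * t),
        np.cos(2 * np.pi * t),
    ])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    trend_coef = beta[1]                       # change in log concentration per year
    return (np.exp(trend_coef) - 1.0) * 100.0  # percent change per year

# Illustrative synthetic record: a concentration declining ~5% per year,
# with some dependence on discharge and random noise.
rng = np.random.default_rng(0)
t = np.linspace(1994, 2014, 500)
flow = np.exp(rng.normal(0, 0.5, t.size))
conc = 2.0 * np.exp(-0.05 * (t - 1994)) * flow**0.3 * np.exp(rng.normal(0, 0.2, t.size))
print(f"Estimated trend: {percent_per_year_trend(t, conc, flow):.1f}% per year")
```

A negative value (here roughly -5% per year) corresponds to the kind of significant declines the abstract reports for NH4, soluble reactive P, and suspended sediment in the 1990s and early 2000s.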
Abnormalities of thyroid function are common in patients with nephrotic syndrome (NS). However, few studies have reported on the association between clinicopathologic features and thyroid dysfunction in patients with NS. We retrospectively studied 317 patients with a definitive diagnosis of NS. The NS patients with thyroid dysfunction had higher urine protein, creatinine, and lipid levels and lower albumin and hemoglobin than those with normal thyroid function, with no significant difference in the distribution of pathologic types. Interestingly, after dividing the patients with thyroid dysfunction into five subgroups, membranous nephropathy was the most common pathologic type both in the normal thyroid group and in the subclinical hypothyroidism group (40.4% and 46.7%, respectively), followed by minimal change disease (28.1% and 21.7%, respectively), whereas in the hypothyroid, low T3, and low T3T4 groups, minimal change disease was the leading type (48.8%, 33.3%, and 38.6%, respectively). High levels of urinary protein, creatinine, cholesterol, and platelets were independent risk factors for thyroid dysfunction, whereas higher albumin and hemoglobin were protective factors. We demonstrated that the distribution of renal pathologic types differed among NS patients across thyroid dysfunction subgroups. Interpreting the interactions between thyroid and renal function remains a challenge for clinicians involved in the treatment of patients with NS.
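Calling the laboratory values above "independent" risk or protective factors implies adjustment in a multivariable model. The sketch below (Python with statsmodels) shows how adjusted odds ratios of this kind are commonly obtained; the synthetic data, outcome definition, and variable names are purely illustrative and do not reproduce the study's analysis.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data: one row per patient, columns matching the candidate predictors
# named in the abstract (synthetic values for illustration only, not study data).
rng = np.random.default_rng(1)
n = 317
names = ["urine_protein", "creatinine", "cholesterol", "platelets", "albumin", "hemoglobin"]
X = rng.normal(size=(n, len(names)))
# Synthetic outcome: dysfunction more likely with high urine protein and low albumin.
y = (1.2 * X[:, 0] - 1.0 * X[:, 4] + rng.normal(0, 1.0, n) > 0).astype(int)

# Multivariable logistic regression: an adjusted odds ratio > 1 behaves as a risk
# factor and < 1 as a protective factor, after adjustment for the other covariates.
result = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
for name, coef in zip(names, result.params[1:]):
    print(f"{name:>14s}: adjusted OR = {np.exp(coef):.2f}")
```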
Highlights
- Elevated soluble TNFR1 levels are predictive of liver toxicity among patients receiving radiation.
- Soluble TNFR1 levels do not independently predict liver toxicity when included in models with ALBI and mean liver dose.
- Data suggest that liver inflammation mediates toxicity after liver irradiation and that the TNFα axis is associated with this inflammation.
- Future studies should evaluate approaches that target pre-treatment inflammation to reduce the risk of toxicity.