Purpose: Psychosocial variables are known risk factors for the development and chronification of low back pain (LBP). Psychosocial stress is one of these risk factors. This study therefore aims to identify the most important types of stress predicting LBP. Self-efficacy was included as a potential protective factor related to both stress and pain. Participants and Methods: This prospective observational study assessed n = 1071 subjects with low back pain over 2 years. Psychosocial stress was evaluated broadly using instruments assessing perceived stress, stress experiences in work and social contexts, vital exhaustion, and life-event stress. In addition, self-efficacy and pain (characteristic pain intensity and disability) were assessed. Using least absolute shrinkage and selection operator (LASSO) regression, important predictors of characteristic pain intensity and pain-related disability at 1-year and 2-year follow-up were identified. Results: The final sample for the statistical analysis consisted of 588 subjects (age: 39.2 (±13.4) years; baseline pain intensity: 27.8 (±18.4); disability: 14.3 (±17.9)). At the 1-year follow-up, the stress types "tendency to worry", "social isolation", and "work discontent" as well as vital exhaustion and negative life events were identified as risk factors for both pain intensity and pain-related disability. At the 2-year follow-up, LASSO models identified the stress types "tendency to worry", "social isolation", "social conflicts", and "perceived long-term stress" as potential risk factors for both pain intensity and disability. Furthermore, "self-efficacy" ("internality", "self-concept") and "social externality" played a role in reducing pain-related disability. Conclusion: Stress experiences in social and work-related contexts were identified as important risk factors for LBP 1 or 2 years later, even in subjects with low initial pain levels. Self-efficacy emerged as a protective factor for pain development, especially in the long-term follow-up. The results suggest differentiating between stress types when addressing psychosocial factors in research, prevention, and therapy.
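For readers unfamiliar with the variable-selection step, the following is a minimal sketch of how a cross-validated LASSO regression of this kind could be run in Python with scikit-learn. The file name, predictor scale names, and outcome column are hypothetical placeholders, not the study's actual dataset fields.

```python
# Illustrative sketch of LASSO-based predictor selection, not the authors' code.
import pandas as pd
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

df = pd.read_csv("lbp_cohort.csv")  # hypothetical data file

# Assumed names for baseline stress and self-efficacy scales.
predictors = ["worry", "social_isolation", "work_discontent",
              "vital_exhaustion", "life_events", "self_efficacy"]
X = StandardScaler().fit_transform(df[predictors])  # LASSO requires comparable scales
y = df["pain_intensity_1y"]  # characteristic pain intensity at 1-year follow-up

# Cross-validation picks the shrinkage parameter alpha; coefficients shrunk
# exactly to zero drop their predictors from the model.
lasso = LassoCV(cv=10, random_state=0).fit(X, y)
selected = [p for p, c in zip(predictors, lasso.coef_) if c != 0]
print("retained predictors:", selected)
```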
Two new screening tools identify increased risk of chronic low back pain and potentially treatment-modifiable prognostic indicators in 4 yellow flag domains.
Low back pain is a major health problem, exacerbated by the fact that most treatments are not suitable for self-management in everyday life. In particular, interdisciplinary programs consist of intensive therapy lasting several weeks. Additionally, therapy components are rarely coordinated with regard to reinforcing effects, which would improve complaints in persons with higher pain. This study assessed the effectiveness of a self-management program, first for persons suffering from higher pain and second in comparison with routine care. The study objectives were addressed in a single-blind multicenter controlled trial. A total of n = 439 volunteers (age 18–65 years) were randomly assigned to a twelve-week multidisciplinary sensorimotor training (3 weeks center-based, 9 weeks home-based) or to a control group. The primary outcome, pain (Chronic Pain Grade), as well as mental health, were assessed by questionnaires at baseline and follow-up (3/6/12/24 weeks, M2–M5). Multiple linear regression models were used for the statistical analysis. N = 291 participants (age 39.7 ± 12.7 years, 61.1% female, 77% CPG = 1) completed the training (M1/M4/M5), showing a significantly stronger reduction of mental health complaints (anxiety, vital exhaustion) in people with higher pain than in those with lower pain in the multidisciplinary treatment. Compared with routine care, the self-management multidisciplinary treatment led to a clinically relevant reduction of pain disability and to significant improvements in mental health. Low-cost exercise programs may provide substantial relief for therapeutic processes and rehabilitation aftercare and, thus, cost savings for the health system.
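A minimal sketch of the kind of multiple linear regression model reported here, using statsmodels; the data file, column names, and covariate set are assumptions for illustration, not the trial's actual analysis script.

```python
# Illustrative sketch of a baseline-adjusted group comparison, not the authors' code.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("trial_data.csv")  # hypothetical data file

# Follow-up pain disability (M5) modeled on treatment group, adjusted for the
# baseline value (M1) and basic covariates; 'group' coded 0 = control, 1 = training.
model = smf.ols("disability_m5 ~ group + disability_m1 + age + sex", data=df).fit()
print(model.summary())  # the 'group' coefficient estimates the treatment effect
```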
Summary
Grafts from elderly donors are increasingly used for liver transplantation. As yet there are no published systematic data to guide the use of specific age cutoffs, and the effect of elderly donors on patient outcomes must be clarified. This study analyzed the Eurotransplant database (01/01/2000–31/07/2014; N = 26 294), from which 8341 liver transplantations were selected for this analysis. Of these grafts, 2162 came from donors >60 years, including 203 from octogenarian donors (≥80 years). The primary outcome was the risk of graft failure according to donor age, using a confounder-adjusted Cox regression model with frailty terms (i.e., random effects). The proportion of elderly grafts increased during the study period [e.g., octogenarians: 0.1% (n = 1) in 2000 vs. 3.4% (n = 45) in 2013]. Kaplan–Meier and Cox analyses revealed reduced survival and a higher risk of graft failure with increasing donor age. Although the age effect was allowed to vary non-linearly, a linear association (hazard ratio, HR = 1.1 per 10-year increase in donor age) was evident. The linearity of the association suggests that there is no particular age at which the risk increases more rapidly, providing no evidence for a cutoff age. In clinical practice, the combination of high donor age with HU transplantations, hepatitis C, high MELD scores, and long cold ischemic times should be avoided.
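A rough sketch of a confounder-adjusted Cox model along these lines, using the Python lifelines package. lifelines has no built-in frailty (random-effect) term, so clustered (robust) standard errors per transplant center stand in for the center effect here; all column names are hypothetical.

```python
# Illustrative sketch of a confounder-adjusted Cox model for graft failure,
# not the study's actual analysis (which used frailty terms).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("eurotransplant.csv")  # hypothetical database extract

cph = CoxPHFitter()
cph.fit(
    df[["time_to_failure", "failed", "donor_age", "meld", "cold_ischemia_h",
        "hcv", "hu_status", "center_id"]],
    duration_col="time_to_failure",
    event_col="failed",
    cluster_col="center_id",  # robust (sandwich) errors clustered by center
)
# Under a linear age effect, exp(10 * beta_donor_age) gives the HR
# per 10-year increase in donor age.
cph.print_summary()
```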