2019
DOI: 10.15171/ijhpm.2019.08

Framing Bias in the Interpretation of Quality Improvement Data: Evidence From an Experiment

Abstract: Background: A growing body of public management literature sheds light on potential shortcomings to quality improvement (QI) and performance management efforts. These challenges stem from heuristics individuals use when interpreting data. Evidence from studies of citizens suggests that individuals’ evaluation of data is influenced by the linguistic framing or context of that information and may bias the way they use such information for decision-making. This study extends prospect theory into the field of publ…

Cited by 15 publications (10 citation statements)
References 40 publications
“…There is little evidence that the considerable effort and resources invested in quality registrations, including those used to benchmark healthcare providers, and incentives lead to improved health outcomes.4,10–12 Moreover, quality indicators, as seemingly objective data, may be prone to validity and reliability issues. As these data are often collected by healthcare professionals and managers, they are subject to interpretation issues and even gaming to improve institutions' results.…”
Section: Key Messages (mentioning; confidence: 99%)
“…Although a criticism of all health worker motivation studies, quantitative data were self-reported and may be subject to acceptability biases. QI data have been found previously to be influenced by whether questions were framed positively or negatively (Ballard, 2019), yet although such biases would overestimate cross-sectional estimates of motivation, these would likely cancel out in our analyses over time or by QI or comparison area, assuming biases were not affected by the programme. Additionally, data were not available on potentially important contextual factors which may have enhanced or inhibited the impact of the QI intervention, e.g.…”
Section: Discussion (mentioning; confidence: 89%)
“…This challenge is not unique to our study or its design, however, and it is impossible to know how or if such differences in interpretation would impact the results.17,18 One unique confounder to our study that could have affected both the response rate and potential interpretation of the results was the timing of survey administration. This survey was completed during the early stages of the COVID-19 pandemic in April 2020 to May 2020, just as national and local lockdowns were in effect and many educational institutions switched to virtual learning.…”
Section: Discussion (mentioning; confidence: 99%)