Background. Controlling costs and achieving health care quality improvements require the participation of activated and informed consumers and patients. Objectives. We describe a process for conceptualizing and operationalizing what it means to be "activated," delineate the process we used to develop a measure for assessing "activation," and report the psychometric properties of that measure. Methods. We used the convergence of the findings from a national expert consensus panel and patient focus groups to define the concept and identify the domains of activation. These domains were operationalized by constructing a large item pool. Items were pilot-tested and initial psychometric analysis performed using Rasch methodology. The third stage refined and extended the measure. The fourth stage used a national probability sample to assess the measure's psychometric performance overall and within different subpopulations. Study Sample. Convenience samples of patients with and without chronic illness, and a national probability sample (N = 1,515), are included at different stages in the research. Conclusions. The Patient Activation Measure is a valid, highly reliable, unidimensional, probabilistic Guttman-like scale that reflects a developmental model of activation. Activation appears to involve four stages: (1) believing the patient role is important, (2) having the confidence and knowledge necessary to take action, (3) actually taking action to maintain and improve one's health, and (4) staying the course even under stress. The measure has good psychometric properties, indicating that it can be used at the individual patient level to tailor interventions and assess change.
Objective. The Patient Activation Measure (PAM) is a 22-item measure that assesses patient knowledge, skill, and confidence for self-management. The measure was developed using Rasch analyses and is an interval-level, unidimensional, Guttman-like measure. The current analysis is aimed at reducing the number of items in the measure while maintaining adequate precision. Study Methods. We relied on an iterative use of Rasch analysis to identify items that could be eliminated without significant loss of precision and reliability. With each item deletion, the item scale locations were recalibrated and the person reliability evaluated to check whether, and by how much, precision of measurement declined as a result of deleting the item. Data Sources. The data used in the analysis were the same data used in the development of the original 22-item measure. These data were collected in 2003 via a telephone survey of 1,515 randomly selected adults. Principal Findings. The analysis yielded a 13-item measure that has psychometric properties similar to the original 22-item version. The scores for the 13-item measure range in value from 38.6 to 53.0 (on a theoretical 0-100 point scale). The range of values is essentially unchanged from the original 22-item version. Subgroup analysis suggests that there is a slight loss of precision with some subgroups. Conclusions. The results of the analysis indicate that the shortened 13-item version is both reliable and valid. Key Words: patient activation, self-management, consumer roles in health care.
A previous publication described the development and testing of the Patient Activation Measure (PAM), which assesses patient self-reported knowledge, skill, and confidence for self-management of one's health or chronic condition (Hibbard et al. 2004). The 22-item PAM was developed using Rasch psychometric methods and is an interval-level, unidimensional, Guttman-like measure.
This current analysis is aimed at reducing the number of items in the measure without significant loss of precision. The methodology used to create the short form (PAM-13) is described and the psychometric properties of the short-form PAM are compared with those of the original 22-item PAM. Finally, the potential clinical and research applications of the short form measure are discussed.
This study evaluates the impact on quality improvement of reporting hospital performance publicly versus privately back to the hospital. Making performance information public appears to stimulate quality improvement activities in areas where performance is reported to be low. The findings from this Wisconsin-based study indicate that there is added value to making this information public.
Public reporting of health care performance has grown substantially in recent years, and considerable resources are spent on quality measurement and reporting. Yet it is unclear what impact, if any, these activities have had on quality improvement. Further, the relative impact of reporting for consumer choice (public reporting) versus reporting for internal consumption (private reporting) on providers' motivation to improve has not been examined. Most proponents of the public release of health care performance information believe that making this information public will increase health care providers' motivation to improve. Motivation is thought to be driven by a desire to protect or enhance public reputation or market share, or both. Simply knowing that performance is inadequate may not be sufficiently motivating. Existing evidence on the efficacy of public reporting to stimulate improvements is mixed.1 Some studies have found that hospital mortality is reduced following the release of performance data; other studies report no effect of public releases.2 Very few studies have looked at the impact of public reporting on subsequent quality improvement efforts.3 The strength of the research designs and the quality of the reporting efforts may contribute to the mixed findings. Almost no evaluations of the impacts of public performance reports have used controlled experimental designs. Most studies assess performance before the public release of information and again after the release.
There is also a great deal of variation in how well the reports are designed and
This study builds on earlier work by assessing the long-term impact of a public hospital performance report on both consumers and hospitals. In doing so, we shed light on the relative importance of alternative assumptions about what stimulates quality improvements. The findings indicate that making performance data public results in improvements in the clinical area reported upon. An earlier investigation indicated that hospitals included in the public report believed that the report would affect their public image. Indeed, consumer surveys suggest that inclusion did affect hospitals' reputations.
Quantitative mixed models were used to examine literature published from 1966 through 2016 on the effectiveness of Direct Instruction. Analyses were based on 328 studies involving 413 study designs and almost 4,000 effects. Results are reported for the total set and subareas regarding reading, math, language, spelling, and multiple or other academic subjects; ability measures; affective outcomes; teacher and parent views; and single-subject designs. All of the estimated effects were positive and all were statistically significant except results from metaregressions involving affective outcomes. Characteristics of the publications, methodology, and sample were not systematically related to effect estimates. Effects showed little decline during maintenance, and effects for academic subjects were greater when students had more exposure to the programs. Estimated effects were educationally significant, moderate to large when using the traditional psychological benchmarks, and similar in magnitude to effect sizes that reflect performance gaps between more and less advantaged students.
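The kind of effect pooling reported above can be illustrated with a minimal computation. The study itself used quantitative mixed models over thousands of effects; as a simpler, hypothetical illustration, this sketch pools study-level effect sizes with the DerSimonian-Laird random-effects estimator, which down-weights studies less sharply when between-study heterogeneity (tau-squared) is large.

```python
def pool_effects(effects, variances):
    """Random-effects pooling via the DerSimonian-Laird estimator.
    effects: per-study effect sizes; variances: their sampling variances.
    Returns (pooled_effect, tau_squared)."""
    w = [1 / v for v in variances]
    # Fixed-effect (inverse-variance) estimate, used to compute heterogeneity Q
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)  # between-study variance, floored at 0
    # Re-weight with tau2 added to each study's variance
    w_star = [1 / (v + tau2) for v in variances]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    return pooled, tau2
```

A metaregression, as mentioned in the abstract, extends this by regressing the effect sizes on study-level covariates (publication, methodology, and sample characteristics) under the same weighting scheme.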