Background: The Internet, with its capacity to provide information that transcends barriers of time and space, continues to transform how people find and apply information to their own lives. With the current explosion in electronic sources of health information, including thousands of websites and hundreds of mobile phone health apps, electronic health literacy is gaining increasing prominence in health and medical research. An important dimension of electronic health literacy is the ability to appraise the quality of information that informs everyday health care decisions. Health information seekers explore their care options by gathering information from health websites, blogs, Web-based forums, social networking websites, and advertisements, even though the quality of information on the Internet varies greatly. Nonetheless, research has lagged in establishing multidimensional instruments, in part because the construct of health literacy itself is still evolving.

Objective: The purpose of this study was to examine the psychometric properties of a new electronic health literacy (ehealth literacy) measure in a national sample of Internet users, with specific attention to older users. The study is motivated by the fact that ehealth literacy remains an underinvestigated area of inquiry.

Methods: Our sample was drawn from a panel of more than 55,000 participants maintained by Knowledge Networks, the largest national probability-based research panel for Web-based surveys. We examined the factor structure of a 19-item electronic Health Literacy Scale (e-HLS) through exploratory factor analysis (EFA) and confirmatory factor analysis (CFA), internal consistency reliability, and construct validity in a sample of adults (n=710) and a subsample of older adults (n=194). The AMOS 21.0 graphics program was used to construct a measurement model linking the latent factors obtained from the EFA with the 19 indicators, to determine whether this factor structure achieved a good fit for the full sample and the older subsample (age ≥60 years). Linear regression analyses were performed in separate models to examine (1) the construct validity of the e-HLS and (2) its association with respondents’ demographic characteristics and health variables.

Results: The EFA produced a 3-factor solution: communication (2 items), trust (4 items), and action (13 items). The 3-factor structure of the e-HLS was found to be invariant for the subsample. Fit indices were as follows: full sample: χ² (N=710)=698.547, df=131, P<.001, comparative fit index (CFI)=0.94, normed fit index (NFI)=0.92, root mean square error of approximation (RMSEA)=0.08; older subsample (age ≥60 years): χ² (N=194)=275.744, df=131, P<.001, CFI=0.95, NFI=0.90, RMSEA=0.08.

Conclusions: The analyses supported the validity and internal reliability of the e-HLS for the full sample and the subsample. The overwhelming majority of respondents reported a great deal of confidence in their ability to appraise the quality of information obtained from the Internet, yet less than half reported performing quality check...
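The abstract describes the analysis pipeline only at a high level. As a rough illustration (not the authors' code, which used AMOS), the sketch below shows how the EFA step and a fit-index sanity check could look in Python with the factor_analyzer package; the file name and item DataFrame are assumptions for demonstration only.

```python
# Illustrative sketch only: assumes a pandas DataFrame whose 19 columns
# hold numeric responses to the e-HLS items (hypothetical CSV export).
import pandas as pd
from factor_analyzer import FactorAnalyzer

responses = pd.read_csv("ehls_items.csv")  # hypothetical file of 19 item scores

# Extract a 3-factor solution with an oblique rotation, mirroring the
# communication / trust / action structure reported in the Results.
efa = FactorAnalyzer(n_factors=3, rotation="promax")
efa.fit(responses)
loadings = pd.DataFrame(efa.loadings_, index=responses.columns,
                        columns=["communication", "trust", "action"])
print(loadings.round(2))

# Sanity check: the reported full-sample RMSEA follows from chi-square,
# df, and N via RMSEA = sqrt((chi2/df - 1) / (N - 1)).
chi2, df, n = 698.547, 131, 710
print(round(((chi2 / df - 1) / (n - 1)) ** 0.5, 2))  # -> 0.08, as reported
```

The same check on the older subsample (χ²=275.744, df=131, N=194) likewise yields approximately 0.08.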
Reporting healthcare quality has become an important factor in healthcare delivery. Prior research has shown that patient-consumers infrequently use information on websites reporting physician quality to guide their choice of physicians. Our aim is to understand the contextual and personal characteristics that influence patient-consumers’ decisions to trust or ignore sources of information about healthcare quality. We use data from Finding Quality Doctors: How Americans Evaluate Provider Quality in the US, 2014, to examine factors that explain trust in sources reporting the quality of healthcare provided by physicians. Using factor analysis, we identified 3 overarching information sources: (1) employers and healthcare providers; (2) user advocacy sources; and (3) insurance companies and government. We then use multiple regression analysis to identify the factors that affect trust in these 3 information sources. Contrary to previous findings, health status was not a significant predictor of trust in sources reporting care quality data. Age was the only factor significantly correlated with trust in information from all 3 sources: younger adults were more trusting of all sources than older adults. Furthermore, political affiliation, employment status, income, and area of residence correlated with trust in care quality information from either companies and government agencies or family and social network sources. The results suggest that individual and contextual characteristics are significant factors in trusting information sources regardless of health status, and they should be taken into consideration by those promoting public reporting of healthcare quality information.
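As a minimal sketch of the regression step described above (the survey's actual variable coding is not given here, so every column name below is a hypothetical stand-in), one model per information source could be fit as follows:

```python
# Minimal sketch, assuming a DataFrame `survey` with one row per respondent;
# all column names are hypothetical stand-ins for the study's variables.
import pandas as pd
import statsmodels.formula.api as smf

survey = pd.read_csv("finding_quality_doctors_2014.csv")  # hypothetical export

# One OLS model per trust outcome; shown here for the insurer/government source.
model = smf.ols(
    "trust_insurer_gov ~ age + income"
    " + C(political_affiliation) + C(employment_status)"
    " + C(residence_area) + C(health_status)",
    data=survey,
).fit()
print(model.summary())  # coefficients show which characteristics predict trust
```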
To our knowledge, we were the first to examine the effect of type of oral solid medication packaging on medication preparation time in nursing homes. Type of packaging matters. The time saved using strip packaging (vs bingo card packaging) has implications for quality of care and the movement towards person-centred care in the nursing home sector. Nurses (or other staff tasked with medication preparation) in nursing homes using strip packaging potentially have more time to devote to nurturing a relationship with the resident. However, time saved in medication preparation by strip packaging is counterproductive if a serious error results. Thus, future studies should investigate the effects of type of packaging on medication preparation errors.
Online educators rely on asynchronous discussions to provide the bulk of the student-student engagement that is lost in the transition from a face-to-face to an online format. However, not all discussion platforms are created equal, and some specialized tools may offer advantages over the standard tools embedded in more comprehensive learning management systems (LMS). In this study, we compare two online discussion platforms: the native discussion tool embedded within the Canvas LMS and a specialized discussion platform, Packback, which supports students and instructors with gamified elements and artificial intelligence. Specifically, we assess whether post quality differs across these platforms, as measured by average word count, number of cited sources, and weekly participation rate. The initial results are mixed and differ by course, leading us to conclude that a platform's impact on engagement and quality likely depends on the course content, the instructor, and the discussion protocol.
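The three quality metrics named above are straightforward to operationalize. As a toy sketch only (the column names are assumptions, not the study's actual schema):

```python
# Toy sketch of the three engagement metrics, assuming a DataFrame `posts`
# with hypothetical columns 'platform', 'student', 'week', 'text', and
# 'n_cited_sources'.
import pandas as pd

posts = pd.read_csv("discussion_posts.csv")  # hypothetical export per course
posts["word_count"] = posts["text"].str.split().str.len()

per_platform = posts.groupby("platform").agg(
    avg_word_count=("word_count", "mean"),
    avg_cited_sources=("n_cited_sources", "mean"),
)
# Weekly participation: distinct students posting per platform and week.
weekly_participation = posts.groupby(["platform", "week"])["student"].nunique()
print(per_platform, weekly_participation, sep="\n")
```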