2003
DOI: 10.2196/jmir.5.4.e30

Design and Testing of a Tool for Evaluating the Quality of Diabetes Consumer-Information Web Sites

Abstract: Background: Most existing tools for measuring the quality of Internet health information focus almost exclusively on structural criteria or other proxies for quality information rather than evaluating actual accuracy and comprehensiveness. Objective: This research sought to develop a new performance-measurement tool for evaluating the quality of Internet health information, test the validity and reliability of the tool, and assess the variability in diabetes Web site quality. Methods: An objective, systematic tool was…

Cited by 38 publications (33 citation statements)
References 12 publications
“…Inconsistencies between the outcomes of diagnostic tests run by different laboratories on the same sample using the same test have been shown to be partly due to subjectivity of the diagnostic test evaluations experienced by independent raters (see for example McClure et al 2005). Training and experience would likely improve inter-rater reliability (Seidman et al 2003, Sevatdal 2005). The inter-rater reliability of the current bioassay protocol was evaluated by comparing 2 independent raters' evaluations of sea lice responsiveness to EMB using the CCC.…”
Section: Rater Agreement
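The CCC referenced in the statement above is Lin's concordance correlation coefficient, a standard measure of agreement between two raters scoring the same items on a continuous scale. A minimal sketch of how it is computed, using hypothetical paired ratings (not data from the cited study):

```python
def lins_ccc(x, y):
    """Lin's concordance correlation coefficient for two raters'
    paired scores: combines precision (correlation) with accuracy
    (closeness of the two raters' means and variances)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Population (1/n) variances and covariance, per Lin (1989).
    sx2 = sum((xi - mx) ** 2 for xi in x) / n
    sy2 = sum((yi - my) ** 2 for yi in y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / n
    return 2 * sxy / (sx2 + sy2 + (mx - my) ** 2)

# Hypothetical paired scores from two independent raters.
rater_a = [72, 85, 60, 90, 78, 65]
rater_b = [70, 88, 58, 93, 75, 68]
print(lins_ccc(rater_a, rater_b))
```

A CCC of 1 indicates perfect concordance (both raters give identical scores); values near 0 indicate no agreement beyond chance.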
“…Kim et al reviewed 29 published rating tools and identified several key criteria for the evaluation of websites (see Table 1). Self-policing approaches, such as that established by Health on the Net (HON), enable websites to exhibit the HON code if they conform with a set of principles similar to those identified by Kim and colleagues.…”
Section: Evaluation Of Websites
“…Similar to the Seidman diabetes website tool, which demonstrated moderate-to-high reliability, the modified HSDD website tool demonstrated moderate reliability (Seidman, Steinwachs, & Rubin, 2003). In contrast to the Seidman study, which used an interrater methodology, this study used an intrarater methodology to assess reliability.…”
Section: Discussion
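For categorical quality ratings (as opposed to the continuous scores handled by the CCC), interrater reliability of the kind contrasted in the statement above is commonly summarized with Cohen's kappa, which corrects raw percent agreement for agreement expected by chance. A minimal sketch with hypothetical labels (not data from either cited study):

```python
from collections import Counter

def cohens_kappa(r1, r2):
    """Cohen's kappa: chance-corrected agreement between two raters
    assigning categorical labels to the same set of items."""
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    # Chance agreement from each rater's marginal label frequencies.
    c1, c2 = Counter(r1), Counter(r2)
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical quality ratings of six websites by two raters.
rater_1 = ["good", "good", "poor", "fair", "good", "poor"]
rater_2 = ["good", "fair", "poor", "fair", "good", "poor"]
print(cohens_kappa(rater_1, rater_2))  # → 0.75
```

Kappa of 1 means perfect agreement; 0 means agreement no better than chance. An intrarater design would pass the same rater's two rating sessions as the two argument lists.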
“…Indeed, the correlation between such criteria and the scientific reliability of Internet health information has been questioned (Ekman, Hall, & Litton, 2005). Seidman and colleagues have emphasized the need for tools with criteria that directly assess the scientific validity of health information present on a website (Seidman, Steinwachs, & Rubin, 2003). Rather than just assessing structural criteria, their tool evaluated criteria addressing the actual accuracy and comprehensiveness of diabetes information on websites.…”