2022
DOI: 10.1371/journal.pone.0262036
Improving usability benchmarking for the eHealth domain: The development of the eHealth UsaBility Benchmarking instrument (HUBBI)

Abstract: Background Currently, most usability benchmarking tools used within the eHealth domain are based on re-classifications of old usability frameworks or generic usability surveys. This makes them outdated and not well suited for the eHealth domain. Recently, a new ontology of usability factors was developed for the eHealth domain. It consists of eight categories: Basic System Performance (BSP), Task-Technology Fit (TTF), Accessibility (ACC), Interface Design (ID), Navigation & Structure (NS), Information &…

Cited by 8 publications (4 citation statements)
References 18 publications
“…Additionally, a quantitative data synthesis (ie, meta-analysis) was not feasible because the included studies were too dissimilar in terms of the intervention content, duration, assessment of outcome measures, follow-up, and comparator groups. Future articles should consider a standard method of assessing outcome measures (possibly a combination of both subjective methods and objective methods), such as using trackers for activity levels, the Behavioral Regulation in Exercise Questionnaire for covering knowledge and motivation levels [62], and the eHealth Usability Benchmarking Instrument for measuring usability (retention and acceptability) [63]. These would make the results more homogeneous for meta-analyses.…”
Section: Limitations and Recommendations
confidence: 99%
“…For better visualization of the SUS results, e-health criteria were adopted [28]. Items 1 and 3 were classified as “Usability and Performance”; 2 and 4 as “Guidance and Support”; 5 and 6 as “Navigation and Interface”; 7 and 8 as “Information and Structure”; and 9 and 10 as “Satisfaction”.…”
Section: Results
confidence: 99%
“…From the patients’ perspective, it is not enough that an eHealth solution is easy or pleasant to use if it does not support their health-related goals, feel meaningful, and fit their real-life situations and daily care activities. Special attention needs to be paid to ensure that these PX-related considerations are included in user-based evaluations of eHealth solutions, as generic usability questionnaires, classifications, and frameworks do not adequately capture these aspects [33,34]. As our study shows, traditional usability tests, complemented with questionnaire and related interview methods, can serve as a meaningful methodological approach for collecting information about PX-related aspects of eHealth solutions.…”
Section: Discussion
confidence: 99%