2013
DOI: 10.14236/ewic/hci2013.49

Are users more diverse than designs? Testing and extending a 25 years old claim

Abstract: Twenty-five years ago, Dennis Egan published a review on the impact of individual differences in human-computer interaction, in which he claimed that users are more diverse than designs are [5]. While cited frequently, this claim has not been tested since then. An efficient research design for separating and comparing variance components is presented, together with a statistical model to test Egan's claim. The results of a pilot study indicate that Egan's claim does not universally hold. An extension to the…
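The variance-component comparison described in the abstract can be sketched with simulated data. The following is a minimal illustration, not the paper's actual model: all numbers are invented, and a classical two-way ANOVA decomposition (with users and designs as crossed random factors, one observation per cell) stands in for the statistical model the paper presents. Egan's claim corresponds to the user variance component exceeding the design variance component.

```python
import numpy as np

rng = np.random.default_rng(42)
n_users, n_designs = 30, 6

# Hypothetical task-completion times (seconds): here users are simulated
# as more variable (sd 8) than designs (sd 3), i.e. Egan's claim holds.
user_eff = rng.normal(0.0, 8.0, n_users)
design_eff = rng.normal(0.0, 3.0, n_designs)
noise = rng.normal(0.0, 2.0, (n_users, n_designs))
y = 60.0 + user_eff[:, None] + design_eff[None, :] + noise

# Mean squares for the two random factors and the residual
grand = y.mean()
ms_user = n_designs * np.var(y.mean(axis=1), ddof=1)
ms_design = n_users * np.var(y.mean(axis=0), ddof=1)
resid = y - y.mean(axis=1, keepdims=True) - y.mean(axis=0, keepdims=True) + grand
ms_resid = (resid ** 2).sum() / ((n_users - 1) * (n_designs - 1))

# Method-of-moments variance components (truncated at zero)
var_user = max((ms_user - ms_resid) / n_designs, 0.0)
var_design = max((ms_design - ms_resid) / n_users, 0.0)
print(f"user variance ~{var_user:.1f}, design variance ~{var_design:.1f}")
```

With these invented parameters the user component comes out clearly larger than the design component; the paper's point is that this ordering is an empirical question, not a given.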

Cited by 4 publications (4 citation statements). References 11 publications.
“…For one, it can inform our scientific understanding of human-computer interaction and the development of more inclusive models of human behavior toward devices. Furthermore, it is relevant for those interested in designing systems that can accommodate as many users as possible (cf., [3], [7]).…”
Section: Introduction (mentioning, confidence: 99%)
“…For example, a crucial question in validation testing could be how uniformly users benefit from a novel design, which is different from merely considering the average benefit. Furthermore, tasks can be considered samples, too, and be modelled as random effects [28]. Incorporating task-level random effects allows assessing whether any observed advantage of a novel design is uniform across tasks.…”
Section: Mixed-Effects Linear Models (mentioning, confidence: 99%)
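The idea of treating tasks as random effects can be illustrated with a small simulation (all numbers hypothetical; a simple method-of-moments estimate stands in for a full mixed-effects fit). If the between-task variance of the design's benefit is large relative to the average benefit, the advantage is far from uniform:

```python
import numpy as np

rng = np.random.default_rng(7)
n_tasks, n_users = 10, 20

# Hypothetical per-task benefit of a novel design (seconds saved),
# drawn around a 5 s average with a 4 s task-level spread
task_benefit = rng.normal(5.0, 4.0, n_tasks)
obs = task_benefit[:, None] + rng.normal(0.0, 2.0, (n_tasks, n_users))

task_means = obs.mean(axis=1)
avg_benefit = task_means.mean()

# Between-task variance = variance of task means minus the share
# contributed by within-task noise (method-of-moments estimate)
within = obs.var(axis=1, ddof=1).mean()
var_task = max(np.var(task_means, ddof=1) - within / n_users, 0.0)
print(f"average benefit: {avg_benefit:.1f} s, task-level sd: {var_task ** 0.5:.1f} s")
```

A task-level standard deviation close to (or larger than) the average benefit means some tasks may see little or even negative benefit, which is exactly what averaging across tasks would hide.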
“…Egan calls designs that reduce the probability of extremely low performance robust designs [47]. Random-effects analysis is a suitable method to assess inter-individual variation and can be used to assess the robustness of a design [28]. This is even possible without theoretically inferred predictors to explain the variation, as the individual trajectories speak for themselves.…”
Section: Interpreting Longitudinal Usability Measures (mentioning, confidence: 99%)
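Robustness in this sense can be sketched numerically (invented numbers again): two designs with the same mean performance can differ greatly in how many users fall below an acceptable level, and the one with smaller inter-individual spread is the robust one.

```python
import numpy as np

rng = np.random.default_rng(3)
n_users = 200

# Hypothetical performance scores for two designs with equal means;
# design B is "robust" in Egan's sense: less inter-individual spread
design_a = rng.normal(70.0, 15.0, n_users)
design_b = rng.normal(70.0, 5.0, n_users)

threshold = 50.0  # hypothetical cutoff for "extremely low performance"
for name, scores in [("A", design_a), ("B", design_b)]:
    p_low = (scores < threshold).mean()
    print(f"design {name}: sd={scores.std(ddof=1):.1f}, "
          f"P(score < {threshold:.0f}) = {p_low:.2f}")
```

Despite identical averages, design A leaves a noticeable fraction of users below the cutoff while design B leaves almost none, which is the distinction a mean-only comparison misses.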