2018
DOI: 10.14236/ewic/hci2018.141
Can Video-based Qualitative Analysis Help Us Understand User-algorithm Interaction?

Abstract: There is growing debate in contemporary life over the roles played by algorithms when we browse online. In particular, concerns are raised that algorithmic processes to index, filter and personalise content can 'manipulate' user behaviours in ways that lead to detrimental outcomes both online and offline. This short paper reports on ongoing work to examine how users interact with algorithms when undertaking browsing tasks online. Drawing on insights from ethnomethodology and conversation analysis, this video-based…

Cited by 1 publication (2 citation statements)
References 9 publications
“…After completing the pre-study questionnaire, participants were directed to our prototype e-recruitment system. The implemented system, called Algorithm Playground, offers participants a look behind the scenes of a presumed e-recruitment system and provides textual explanations on how job applicants are ranked by the algorithms.…”
Section: IT Professional
mentioning
confidence: 99%
“…Herlocker et al [3] noted that many recommender systems lack transparency in terms of the recommendation process and result generation. Webb and Patel [12] observed that algorithmic processes that filter and personalise the content seen by users may lead to detrimental outcomes, such as the reinforcement of societal biases and gender or ethnic discrimination, among others. Wang et al [14] reported that people rate an algorithm as more fair when the algorithm predicts in their favor, even compensating for the negative effects of algorithms that are biased against particular demographic groups.…”
mentioning
confidence: 99%