2022
DOI: 10.1145/3555130
'Transparency is Meant for Control' and Vice Versa: Learning from Co-designing and Evaluating Algorithmic News Recommenders

Abstract: Algorithmic systems that recommend content often lack transparency about how they come to their suggestions. One area in which recommender systems are increasingly prevalent is online news distribution. In this paper, we explore how a lack of transparency of (news) recommenders can be tackled by involving users in the design of interface elements. In the context of automated decision-making, legislative frameworks such as the GDPR in Europe introduce a specific conception of transparency, granting 'data subjec…

Cited by 14 publications (3 citation statements) · References 29 publications
“…We believe current work on training, adapting, and building applications around LLMs can take valuable lessons from these lines of research. More recent HCI studies on algorithmic transparency also highlight that providing transparency without supporting control leaves users frustrated, while effective, efficient, and satisfying control cannot be achieved without transparency (Smith-Renner et al, 2020;Storms et al, 2022).…”
Section: Transparency and Control Often Go Hand-in-Hand (mentioning; confidence: 99%)
“…We believe current work on training, adapting, and building applications around LLMs can take valuable lessons from this line of research. More recent HCI studies on algorithmic transparency have highlighted that providing transparency without supporting control leaves users frustrated, while effective, efficient, and satisfying control cannot be achieved without transparency [164,167]. More critically, scholars have called out the risk of algorithmic transparency without paths for actionability and contestability as creating a false sense of responsibility and user agency [5,98].…”
Section: Transparency and Control Often Go Hand-in-Hand (mentioning; confidence: 99%)
“…in terms of making their suffering equally visible via AI curation) and dealing with the problem of representation (e.g. in terms of filtering out and removing information distorting historical facts), it also makes the functionality of these systems less transparent, thus limiting the user control over the system [52].…”
Section: Rohingya Genocide (mentioning; confidence: 99%)