2015
DOI: 10.1007/s10676-015-9366-9
The use of software tools and autonomous bots against vandalism: eroding Wikipedia’s moral order?

Abstract: English-language Wikipedia is constantly being plagued by vandalistic contributions on a massive scale. In order to fight them its volunteer contributors deploy an array of software tools and autonomous bots. After an analysis of their functioning and the 'coactivity' in use between humans and bots, this research 'discloses' the moral issues that emerge from the combined patrolling by humans and bots. Administrators provide the stronger tools only to trusted users, thereby creating a new hierarchical layer. Fu…

Cited by 6 publications (4 citation statements) · References 19 publications
“…We are Wikipedia researchers who have been working on various issues around bots and automation for many years, and we were surprised and intrigued to hear these findings. While bot-bot conflict certainly takes place in Wikipedia for a variety of reasons, conflict at the size, scale, and severity the authors of the EGBF paper claimed has not been mentioned in almost a decade of multi-disciplinary scholarship on the governance of Wikipedia bots (and particularly the Bot Approvals Group) [13,19,24,34,43,55,65]. Previous literature has largely celebrated the successes of Wikipedia's approach to automation, which is based on the same principles as editing Wikipedia articles: decentralized consensus-building, scaffolded by formalized policies and processes.…”
Section: Even Good Bots Fight? The Broader Implications of Wikipedia
confidence: 99%
“…As bots play a substantial role in Wikipedia, they have been extensively discussed in previous literature [13,24,34,43,55,62,65], even incidentally in the context of other topics [8,19,28,51]. It is important to distinguish between: 1) bot-bot conflict, in which automated software agents get into "edit wars" with each other over the content of articles due to being programmed with opposing directives and 2) conflict about bots, in which human Wikipedians conflict with each other about what kinds of tasks ought to be automated in Wikipedia and how.…”
Section: Bot Governance
confidence: 99%
“…This system of surveillance, carried out by humans, bots, and tools, has been described before and analysed in view of the moral questions that it raises (de Laat 2015). It was found, to begin with, that, although all Wikipedians are invited to watch out for vandalism and revert any instance of the kind, the stronger tools in the counter-vandalism repertoire which allow faster search and correction are only distributed to trusted users.…”
Section: Introduction
confidence: 99%