1982
DOI: 10.2307/2634327
The Social Control of Technology

Cited by 15 publications (17 citation statements) | References 0 publications
“…However, assessing the societal impact of a technology in general, and of AI in particular, is a typical case of the Collingridge dilemma (1982): such developments are either difficult to predict while they do not yet exist, or difficult to manage and regulate once they are already ubiquitous. If the technology is sufficiently developed and widely available, it can be evaluated well, but by then it is often too late to regulate its development.…”
Section: Studies on Human Perception of AI
confidence: 99%
“…Policy studies scholars generally, including information policy researchers, emphasize this need for policy makers at all levels of government to be aware of and responsive to multiple values, stakeholders, and pro‐social responsibilities (e.g., Auer, 2006; Bacchi, 2000; Braman, 2006; Nagel, 1990; Overman & Cahill, 1990; Parsons, 1999; Reidenberg, 1997; Rowlands, 1996; Schön & Rein, 1994; Stone, 2002; White, 1994). Weighing this large set of values against technological innovation can be particularly challenging, as the scholar David Collingridge (1980) first identified (as quoted in Hageman et al., p. 20, note 103):
The Collingridge dilemma refers to the difficulty of putting the proverbial genie back in the bottle once a given technology has reached a certain inflection point. … Such inflection points represent the moment when a particular technology achieves critical mass in terms of adoption or, more generally, the time when that technology begins to profoundly transform the way individuals and institutions act … “The social consequences of a technology cannot be predicted early in the life of the technology,” Collingridge claimed … “By the time undesirable consequences are discovered, however, the technology is often so much part of the whole economics and social fabric that its control is extremely difficult” … Collingridge referred to this as the “dilemma of control,” noting that: “When change is easy, the need for it cannot be foreseen; when the need for change is apparent, change has become expensive, difficult and time‐consuming.”
…”
Section: The Value of Deliberative Government Action
confidence: 99%
“…And, fourth, network and coordination effects mean that advantages accrue to agents adopting the same technology as others. A good example of such lock-in is provided in Collingridge's (1982) discussion of roads and automobiles, which shows how commitment to certain technologies can, in the longer term, create lock-in as communities of practice and new technologies build up the installed base.…”
Section: Technological Innovation in Border Control
confidence: 99%