2018
DOI: 10.1016/j.futures.2018.01.003
Global catastrophic and existential risks communication scale

Cited by 35 publications (15 citation statements) · References 31 publications
“…If this argument is correct (Haussler, 2016), it would also apply to other civilizations and it appears reasonable to assume that population growth is intertwined with technological and philosophical progress. Indeed, the risk of catastrophic extinction has been estimated of order 0.2 % per year (Matheny, 2007; Simpson, 2016; Turchin and Denkenberger, 2018) at our current technological level and might be applicable to other civilizations within less than a few orders of magnitude (Gerig, 2012; Gerig et al ., 2013). The possession of ever more powerful technology in the hands of (many) individuals is increasingly dangerous (Cooper, 2013).…”
Section: Discussion
confidence: 99%
“…AI could also be used to intentionally or inadvertently develop a pathogen with a 100% case‐fatality rate, paralyze humanity under a totalitarian regime, or send an annihilating command to a robot army. Some have argued that AI is the most salient existential risk (Turchin & Denkenberger, 2018b) with a probability orders of magnitude greater than natural risks or nuclear winter (Ord, 2020). It is surprising then to find that the UN Digital Library does not appear to reflect this.…”
Section: Existential Risks and International Governance: The UN Digital Library
confidence: 99%
“…Global warming is the immediate threat, together with its subthemes of sea-level rise, pestilence, famine and conflict. Yet there are many other ways our civilization could fail, from declining fertility rates to pandemics to replacement of humans by AI to simple urban decay (Baum, 2015; Avin et al ., 2018; Turchin and Denkenberger, 2018). While these threats individually might not be so severe as to cause human extinction, collectively they might conceivably do so (Kareiva and Carranza, 2018).…”
Section: Introduction
confidence: 99%