# Existential risk
|  | A **global catastrophic risk**, or doomsday scenario, is a hypothetical event that could damage human well-being on a global scale, even endangering or destroying modern civilization. An event that could cause human extinction or permanently and drastically curtail humanity's existence or potential is known as an "existential risk." |
|-|-|
| | wikipedia:: [Global catastrophic risk](https://en.wikipedia.org/wiki/Global_catastrophic_risk) |
> [!summary]- Wikipedia Synopsis
> Over the last two decades, a number of academic and non-profit organizations have been established to research global catastrophic and existential risks, formulate potential mitigation measures and either advocate for or implement these measures.
[[Futures Studies]]
## [[Nick Bostrom]]
## [[Centre for the Study of Existential Risk]]
## [[Existential risk from artificial general intelligence]]
## [[Climate Change]]
## Nuclear war
## Inbox
- [[Information Hazard]]