"If we drop the baton, succumbing to an existential catastrophe, we would fail our ancestors in a multitude of ways. We would fail to achieve the dreams they hoped for; we would betray the trust they placed in us, their heirs; and we would fail in any duty we had to pay forward the work they did for us. To neglect existential risk might thus be to wrong not only the people of the future, but the people of the past."
- Toby Ord
We are capable of helping to build a better future for trillions of people. But the loss of human civilization could obliterate that potential.
If we want to do as much good as we can, and create better lives for our descendants, we should consider the ways we could destroy ourselves – and figure out how to stop them from happening.
In this sequence, we'll define "existential risk", explore strategies for addressing it, and examine why this work might be both important and neglected.
Two ways to read
There are two ways to get started, depending on whether you have access to Toby Ord's The Precipice – and if you don't, we'll send you a free copy!
First option (no book): Read the sequence as written (click on "Start reading").
Second option (book): Read chapters 2 and 4 of The Precipice, as well as 80,000 Hours' "Policy and research ideas to reduce existential risk".
Start reading
← Part 4: Longtermism
→ Part 6: Emerging Technologies
Organization Spotlight: Future of Humanity Institute
The Future of Humanity Institute (FHI) is a multidisciplinary research institute working on big-picture questions for human civilisation and exploring what can be done now to ensure a flourishing long-term future. It currently works across four main research areas.
Photo credit: Tim Rüßmann