New infographic based on “The Precipice”. Any feedback?
(edited based on feedback)
I’ve been reading Toby Ord’s thoughtful book, The Precipice. In it, he estimates there’s a 1 in 6 chance of humanity not surviving the next century. He has a table summarizing the main sources of risk as he sees them, which I thought would be more sharable and comparable if put into a diagram. So I made the following diagram in Illustrator (here’s the file if you want to play with it). If this seems like a useful thing to do, I’d love feedback on how it can be improved:
Here’s another version, which I set up more like Minesweeper, with 100 possible futures:
A lot of research and data went into Toby’s estimates, so the downside of making a diagram is that it’s less accurate, since it won’t be wrapped in all of the context from the book; the upside is that it’s potentially more sharable. (Whether that trade-off is worth it seems like an open question.) But it could still work as a good conversation starter for those thinking about cause prioritization and longtermism.
I suggest making it clearer that these are one researcher’s rough estimates. Otherwise I think it gives a false sense of precision. Maybe by titling the infographic “Rough guess at global catastrophic risks from The Precipice” or similar.
Just speaking impressionistically, the minesweeper version looks much more “hazardous” than the first version.
There’s something about the neatly arranged units that seems very manageable.
Obviously there are some biases at play here, but it’s something to be conscious of depending on the use case.
Oh, and great work!
This looks great! And I agree with Aslan that the minesweeper edition feels very different and I am glad you created it.
One note: existential risks are a distinct concept from both extinction risks and global catastrophic risks. Table 6.1 in Toby’s book describes existential risks, which is what you are depicting here: existential risks include extinction risk, but also the risk that humanity will be locked into a permanent dystopia, as well as permanent civilisational collapse (where humanity lives on).
Global catastrophic risks are different again: they are risks that kill at least 10% of the human population.
Thanks for the tip! I’ll change it to existential risks.