A bunch of posts related to The Precipice
I recently finished Toby Ord's The Precipice, and thought it was an excellent and very important book. I plan to write a bunch of posts that summarise, comment on, or take inspiration from various parts of it. Most are currently very early-stage, but the working titles are below.
Key uncertainties/questions:
Is there anyone who's already planning to write similar things? I probably won't have time to write all the things I've planned. So if someone else is already likely to pursue ideas similar to some of these, we could potentially collaborate, or I could share my notes and thoughts, let you take that particular topic from there, and allocate my time to other things.
Working titles:
Defining existential risks and existential catastrophes
My thoughts on Toby Ord's policy & research recommendations
Existential security
Civilizational collapse and recovery: Toby Ord's views and my doubts
The Terrible Funnel: Estimating odds of each step on the x-risk causal path (this title is especially "working")
The idea here would be to adapt something like the "Great Filter" or "Drake Equation" reasoning to estimating the probability of existential catastrophe, using how humanity has fared in prior events that passed or could've passed certain "steps" on certain causal chains to catastrophe.
E.g., even though we've never faced a pandemic involving a bioengineered pathogen, perhaps our experience with how often natural pathogens have moved from each "step" to the next one can inform what would likely happen if we did face a bioengineered pathogen, or if it did reach a pandemic level.
This idea seems sort of implicit in The Precipice, but isn't really spelled out there. Also, as is probably obvious, I need to do more to organise my own thoughts on it.
This may include discussion of how Ord distinguishes natural and anthropogenic risks, and why the standard arguments for an upper bound on natural extinction risk don't apply to natural pandemics. Or that might be a separate post.
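As a very rough illustration of the funnel idea, one could treat a catastrophe as a chain of steps and multiply per-step transition probabilities, in the spirit of the Drake Equation; the per-step probabilities would ideally be informed by how often prior (e.g. natural-pathogen) events cleared each step. Every step name and number below is a made-up placeholder for illustration, not an estimate from the book or from me:

```python
# Sketch of the "Terrible Funnel" idea: the probability of the full
# catastrophe chain as a product of per-step transition probabilities.
# All step names and numbers are purely illustrative assumptions.

steps = {
    "dangerous pathogen created and released": 0.05,
    "release leads to an outbreak":            0.5,
    "outbreak becomes a pandemic":             0.2,
    "pandemic causes civilizational collapse": 0.05,
    "collapse becomes existential":            0.1,
}

p_catastrophe = 1.0
for step, p in steps.items():
    p_catastrophe *= p  # each step must occur for the chain to complete

print(f"P(full chain) = {p_catastrophe:.2e}")  # → P(full chain) = 2.50e-05
```

In practice the interesting work would be in estimating each conditional probability from historical base rates, and in arguing whether those rates transfer to novel (e.g. engineered) cases; the multiplication itself is trivial.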
Developing, but not deploying, drastic backup plans
"Macrostrategy": Attempted definitions and related concepts
This would relate in part to Ord's concept of "grand strategy for humanity"
Collection of notes
A post summarising the ideas of existential risk factors and existential security factors?
I suspect I won't end up writing this, but I think someone should. For one thing, it'd be good to have something people can reference/link to that explains that idea (sort of like the role EA Concepts serves).