Copied from my comment on a Facebook post:
I especially liked Nick’s sapling analogy, and found it fitting. I worry that EAs are drawn from subgroups with a tendency to believe that relatively simple, formalistic, mechanistic models essentially describe complex processes, losing perhaps some accuracy (relative to more complex models) but not changing the general sign and magnitude of the result. This seems really dangerous.
“Imagine a Level 1 event that disproportionately affected people in areas that are strong in innovative science (of which we believe there are a fairly contained number). Possible consequences of such an event might include a decades-long stall in scientific progress or even an end to scientific culture or institutions and a return to rates of scientific progress comparable to what we see in areas with weaker scientific institutions today or saw in pre-industrial civilization.”

It seems likely that any Level 1 event will have disproportionate effects on certain groups (possibly the very groups that would be most useful for bringing civilization back from a Level 1 event), and this seems like a pretty under-investigated consideration. Consider a pandemic that was extremely virulent but only contagious enough to spread fully in big cities, or extreme climate change or geoengineering gone awry that knocked out mostly the global north, mostly equatorial regions, or mostly coastal regions.
He doesn’t really discuss the possibility of a Level 1 event immediately provoking a Level 2 event, but that also seems possible: one catastrophic use of biowarfare could incentivize another country to develop even more powerful bioweapons, or some sort of militarized AI for defense, and catastrophic climate change could prompt the use of extreme and ill-tested geoengineering. This actually seems moderately likely to me, and I wonder why he didn’t discuss it.