I think so! The same goes for bunkers and better facilities for rebuilding after a catastrophe (energy, food, tech for fertility, tech for reindustrialization, and literature on these), and for research on mitigating risks from AI and synthetic pandemics. As Nick Beckstead argues in his thesis, things with a plausible long-run impact seem overwhelmingly important relative to things without one. There is some room for argument around the edges about which short-range philanthropic endeavours might have positive indirect long-run effects, but by and large society has left all of these areas neglected relative to their overwhelming importance. The group of effective altruists and rationalists willing to focus on these issues of direct long-run importance is perhaps 200 people, many of whom are still in school. So we have to prioritize. That means more than half of current effort should go into AI and synthetic pandemics (including their policy), much of the remainder into identifying new technological risks, and only a couple of percent of our attention into bunkers and lunar colonies.
Really, what we need is a lot more people on all of these areas of direct long-run importance.
For what it’s worth, Nick did a shallow investigation of bunker building and found it was likely not very effective (not that this necessarily argues against general efforts to increase civilisation’s robustness).
Yep, the robustness point is picked up by Jebari, while some GCRI folks offer their own remarks in "Isolated refuges for surviving global catastrophes".