I do think this is a concern that we need to consider carefully. On the standard FHI/Open Phil view of existential risk, AI and bio account for most of the existential risk we face this century. I find it difficult to see how increasing economic development in LMICs could affect AI risk. China's massive growth is something of a special case on the AI risk front, I think.
I think growth probably reduces biorisk by increasing the capacity of health systems in poor countries. And leading-edge bioscience research, which is where most of the risk plausibly originates, seems most likely to happen in advanced economies regardless of how fast LMICs grow.
On climate, it seems clear that growth would exacerbate climate change, but it would also increase the capacity of very poor countries to deal with it. Most of the damages out to 2100 seem to me to stem from drier dry places and wetter wet places, and economic development is a good way for poor countries to deal with these problems: they can do desalination, adopt more efficient agriculture, and build flood defences. It would of course be better if they did this with clean energy, but working on that separately seems the best way forward. It's not as if stopping Africa from growing is a top priority for environmentalists.
On nuclear, economic growth is a major risk factor for nuclear weapons status, much more important than other factors people often talk about, such as pursuing a civilian nuclear power programme. But the existential risk from nuclear war is debatable and seems to stem from the unique features of US-Russia tensions; it seems very unlikely that today's LMICs would come to possess thousands of warheads.
On the alternative boring long-termist view, these risks seem a much weaker concern.
Generally, I disagree with Cowen that increasing growth is the best thing to do from a long-termist point of view. Though, as we argue, it does seem good from a person-affecting point of view.