I’m quite excited to see an impassioned case for more of a focus on systemic change in EA.
I used to be quite excited about interventions targeting growth or innovation, but I’ve recently been more worried about accelerating technological risks. Specific things that I expect accelerated growth to affect negatively include:
Climate Change
AGI Risk
Nuclear and Biological Weapons Research
Cheaper weapons in general
Curious about your thoughts on the potential harm that could come if the growth interventions are indeed successful.
I do think this is a concern that we need to consider carefully. On the standard FHI/Open Phil view of existential risk, AI and bio account for most of the existential risk we face this century. I find it difficult to see how increasing economic development in LMICs could affect AI risk. China’s massive growth is something of a special case on the AI risk front, I think.
I think growth probably reduces biorisk by increasing the capacity of health systems in poor countries. Leading-edge bioscience research seems most likely to happen in advanced economies.
On climate, it seems clear that growth would exacerbate climate change, but it would also increase the capacity of very poor countries to deal with climate change. Most of the damages up to 2100 seem to me to stem from drier dry places and wetter wet places, and I think economic development is a good way for poor countries to deal with these problems—they can do desalination, more efficient agriculture, and build flood defences. It would of course be better if they did this with clean energy, but working on that separately seems the best way forward. It’s not as though stopping Africa from growing is a top priority for environmentalists.
On nuclear, economic growth is a major risk factor for nuclear weapons status, much more important than other factors people often talk about, such as pursuing a civilian nuclear power programme. But the existential risk of nuclear war is debatable and seems to stem from the unique features of US–Russia tensions—it seems very unlikely that today’s LMICs would come to possess thousands of warheads.
On the alternative boring long-termist view, these risks seem a much weaker concern.
Generally, I disagree with Cowen that increasing growth is the best thing to do from a long-termist point of view. Though, as we argue, it does seem good from a person-affecting point of view.
I think catch-up growth in developing countries, based on adopting existing technologies, would have positive effects on climate change, AI risk, etc. In contrast, ‘frontier’ growth in developed countries is based on technological innovation, and is potentially more dangerous.
I’m curious about the intuitions behind this. I think developing countries with fast growth have historically had quite high pollution and carbon output. I also think that more countries joining the “developed” category could quite possibly make coordination around technological risks harder.
I think what you’re saying is plausible but I don’t know of the arguments for that case.
If the case for growth in rich and poor countries is very different (possibly negative in the one case but not the other), then it starts to matter a lot whether we can promote growth in poor countries without promoting growth in rich countries as a side-effect. I don’t know how the proposed interventions fare in this respect.