In my mind there are 2 main differences:

Economic degrowth is undesirable, whereas pausing AI is at least arguably desirable. Climate change is very unlikely to lead to a literal existential catastrophe; "business as usual" tech improvements and policy changes (i.e., without overthrowing capitalism) will likely lead to a clean energy transition as is; and economic degrowth would probably kill many more people than it would save. Meanwhile, AI presents a large existential risk in my mind, and I think a pause would probably lower this risk by a non-negligible amount.
Economic degrowth is much less politically feasible than an AI pause: first, because people are loss-averse, so degrowth would take something away from them, whereas a pause would only ask them to forgo future progress; second, because fear of an actual existential risk (from AI) may motivate more extreme policies.
I will say, if I thought p(doom | climate) > 10%, with climate timelines of ~12 years, then I would be in favor of degrowth policies that seemed likely to reduce this risk. I just think that in reality the situation is very different from this.
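The threshold reasoning above can be sketched as a toy expected-value comparison. Every number below is a hypothetical placeholder chosen only to illustrate the argument's shape, not an estimate made anywhere in this thread:

```python
# Toy sketch: why a p(doom) threshold matters for supporting a costly policy.
# All inputs are illustrative placeholders, not real risk estimates.

def absolute_risk_reduction(p_doom: float, relative_reduction: float) -> float:
    """Absolute reduction in catastrophe probability if a policy
    cuts the baseline risk by `relative_reduction` (a fraction)."""
    return p_doom * relative_reduction

# Hypothetical baseline risks (placeholders):
p_doom_climate = 0.001  # the commenter argues this is very low
p_doom_ai = 0.10        # "large" in the commenter's view

# Assume, purely for illustration, each policy halves its baseline risk:
print(absolute_risk_reduction(p_doom_climate, 0.5))  # → 0.0005
print(absolute_risk_reduction(p_doom_ai, 0.5))       # → 0.05
```

Under these made-up inputs, the same relative risk reduction buys two orders of magnitude more absolute risk reduction in the AI case, which is why the comment's position flips if p(doom | climate) were actually above 10%.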
The issue here is not only climate change. We are in dangerous territory on most of the planetary boundaries.
AI presents large existential risk in my mind
One of my points is that EAs do not seem to engage with the risks that, in the minds of degrowthers and the like, are large and close to existential. It is true that they have not fleshed out to what extent their fears are existential, but that is because the risks already seem large enough to worry about. See "'Is this risk actually existential?' may be less important than we think."
I like your second point. But still, even if degrowth is less politically feasible, as you say, if the risk is large enough then EA should be in favour of it. My point is that very little effort has been made to address this "if".