I think this is perhaps a rather simplistic reading of climate change, and whilst it is somewhat in line with the “community orthodoxy”, I think both this post and that orthodoxy are somewhat misguided.
Firstly, this post broadly ignores the concepts of vulnerability and exposure in favour of a pure, singular hazard model, which, whilst broadly in line with the focus of people like Bostrom and Ord, seems overly reductive. Moreover, it seems highly unlikely that even the most dangerous pandemic, or an ordinary nuclear war, would actually cause direct human extinction, meaning that a concern only with direct X-Risk really should lead to a prioritisation of omnicidal actors, AI risk, and other speculative risks like physics experiments and self-replicating nanotechnology. Even if you focus on the broader hazards category, climate’s role as a risk factor is certainly not to be ignored, in particular, I think, in increasing the risk of conflict and increasing the number of omnicidal actors. It should be noted, however, that X-Risk doesn’t just mean human extinction, but anything which irreparably reduces the potential of humanity.
Once you are dealing with GCRs and societal collapse, and with how these might pose an X-Risk (via conversion to irrecoverable societal collapse, a mechanism that still needs more work), climate change rises in priority. Climate change increasing civilisational vulnerability becomes a much more serious issue, and an increase in natural disasters may be enough to cause cascading failures. If you seriously care about the collapse of our complex systems, or about collapses that result in mass death (not necessarily synonymous), I think these more reductionist arguments hold less sway. Whilst I won’t go into the longtermist argument for this in detail here, if you think it unlikely that societal recovery would be in line with what is good (you might be particularly susceptible to this view if you are a moral antirealist who thinks your values are mostly arbitrary), or that societal recovery is reasonably unlikely in the first place, then collapse risks should weigh much more heavily in your prioritisation. It should also be noted that societies seem to struggle to recover in unstable climates, so climate change may make societal recovery even harder. In the article you say that climate’s capacity to cause societal collapse is instead a reason to focus on the relationship between food systems and societal collapse; however, climate doesn’t just impact our food systems but a huge number of our critical systems, and addressing food supply alone may still leave us vulnerable to societal collapse. (NB: I think these societal-collapse tendencies of climate change are generally low probability, probably <10%.) Climate-related vulnerabilities likely raise the probability of each conversion in the chain: shock → GCR, GCR → societal collapse, and societal collapse → irreversible societal collapse. Moreover, the literature on systemic risk would probably further elevate the importance of climate change. If you only care about humanity being fully wiped out, because you think that under almost all GCR/collapse scenarios we recover to the same technological level and with values you agree with, then maybe you can ignore most of this, but I tend to think such an argument is mostly implausible (I won’t give the counterargument here).
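To make the risk-factor point concrete, here is a minimal toy sketch (all probabilities invented purely for illustration, not estimates I would defend): if climate-driven vulnerability modestly raises each conversion probability in the shock → GCR → collapse → irreversible-collapse chain, the increases compound multiplicatively through the whole chain.

```python
# Toy model of the conversion chain. All numbers are invented for
# illustration only -- they are not estimates of the actual risks.
baseline = {
    "shock -> GCR": 0.10,
    "GCR -> societal collapse": 0.20,
    "collapse -> irreversible": 0.10,
}
# ASSUMPTION: climate vulnerability raises every conversion probability by 50%.
stressed = {stage: 1.5 * p for stage, p in baseline.items()}

def chain_probability(stages):
    """Probability that a shock propagates all the way to irreversible collapse."""
    total = 1.0
    for p in stages.values():
        total *= p
    return total

print(f"baseline end-to-end risk: {chain_probability(baseline):.4f}")  # 0.0020
print(f"stressed end-to-end risk: {chain_probability(stressed):.4f}")  # 0.0068 (~3.4x)
```

The point of the sketch is just that a risk factor acting at every stage multiplies through the whole chain (here 1.5³ ≈ 3.4×), which is why a singular-hazard accounting tends to understate it.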
On the topic of neglectedness, it is true that climate change as a whole is not neglected. Nonetheless, potentially high-impact interventions on climate may (and “may” is important) still be available and neglected. So don’t let this general EA advice dissuade you if you think you have found something promising. As for the funding given to climate change, a lot of it relates to investment in energy generation technologies, and much of it pays for itself, although general climate investment is outside my area of expertise. Moreover, it is unclear how much more money on AI Safety would massively help us, although this too is outside my expertise and I know there is a lot of disagreement on it, so take this paragraph with a pinch of salt.
Finally, this article generally presupposes that X-Risk is high at present and that we are at “the hinge of history”, presenting X-Risk work as the only outcome of longtermism. Whilst that may be a common sentiment in the community, it certainly isn’t the only perspective. If, for example, you think X-Risk in general is low, then from other longtermist perspectives the destabilising effects of climate change on the globe and the global economy may indeed be highly important, and then you get into the neglectedness question: is it easier to reduce the negative effect of climate change on GDP growth (many climate interventions probably increase GDP growth as well), or to just focus on GDP growth directly? This is certainly not a settled question, although I think John Halstead has done some work on it which I ought to check.
Whilst I certainly think your argument is useful in parts, including the claim that climate change is probably overhyped, I nonetheless feel you unreasonably suggest climate change is less of an issue than it is. Less focus on Bostrom/Ord-esque existential hazards may be beneficial, along with a greater diversity of viewpoints, including better integration of some of the arguments made in the references you cite.
However, please don’t let the overall critical tone of this comment dissuade you – it’s awesome to see people new to EA writing such genuinely well researched and well written posts on the forum (I certainly haven’t had the bravery to post something on here yet!). Keep up the good work despite my criticisms.
A better argument against climate change as an EA cause is basically that climate change is solving itself this century: clean energy across the entire world by the end of the century is nearly guaranteed by capitalism (specifically, solar/wind/batteries), landing us at around 2C, which is far less of a problem than any GCR. 2C definitely is a problem for moral reasons, but it is very unlikely to get to a GCR state.
There are various problems with this. Firstly, roughly 3C is generally considered the likely warming if policies continue as they are, not the 2C that you claim. If the world achieves decarbonisation leading to 550 ppm (in line with current policies, though again implying about 3C rather than the 2C you claim), there is still a fat-tail risk: there is roughly a 10% probability of 6C of warming, due to our remaining uncertainty over equilibrium climate sensitivity (ECS). This doesn’t meaningfully account for tipping points either, which we would be very likely to hit at such levels of warming. If you want to read more on this, either read Wagner & Weitzman 2015 (it’s a little old but still very relevant) or read some of the literature on fat-tailed climate risks. A 10% chance of more than 6C in a very plausible scenario seems an unacceptably high risk. This doesn’t even account for the possibility (small, but nonetheless far from negligible) that we end up following an RCP8.5 pathway, which would be considerably more devastating.
Even if we do end up reaching the agreed-upon target of roughly 450 ppm (2C levels of CO2 concentration), there is still a 5% chance of 4 degrees of warming and a 1% chance of 5 degrees. The fat tails really matter (data from Quiggin 2017).
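For anyone who wants to see mechanically how ECS uncertainty produces these fat tails, here is a rough Monte Carlo sketch. It assumes a lognormal ECS distribution fitted so that the IPCC “likely” range of 1.5–4.5C forms the central 66% interval – a common illustrative calibration, not the method behind Wagner & Weitzman’s or Quiggin’s figures, so it only reproduces the numbers above to order of magnitude.

```python
import math
import random

# Sketch: propagate equilibrium climate sensitivity (ECS) uncertainty into
# warming probabilities at a fixed CO2 concentration.
# Calibration ASSUMPTION: ln(ECS) ~ Normal, fitted so that 1.5-4.5C is the
# central 66% interval (the IPCC "likely" range). Illustrative only.
mu = (math.log(1.5) + math.log(4.5)) / 2               # median ECS ~ 2.6C
sigma = (math.log(4.5) - math.log(1.5)) / (2 * 0.954)  # 0.954 = z-score of the 83rd percentile

def equilibrium_warming(ppm, ecs, preindustrial=280.0):
    """Equilibrium warming = ECS x number of CO2 doublings above preindustrial."""
    return ecs * math.log2(ppm / preindustrial)

random.seed(0)
ecs_samples = [random.lognormvariate(mu, sigma) for _ in range(100_000)]

for ppm, threshold in [(550, 6.0), (450, 4.0), (450, 5.0)]:
    p = sum(equilibrium_warming(ppm, e) > threshold for e in ecs_samples) / len(ecs_samples)
    print(f"P(warming > {threshold}C at {ppm} ppm) ~ {p:.1%}")
```

The exact tail mass is quite sensitive to which distribution you assume for ECS, which is precisely the fat-tail point: well-behaved central estimates tell you very little about the upper tail.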
Moreover, to suggest 2C is “very unlikely” to lead to a GCR state somewhat ignores the problems I raised in the above response: that the chief issues of climate change are its increase of societal vulnerabilities, its potential to trigger cascading failures, and its capacity to convert civilisational collapse into irreversible civilisational collapse. Obviously a lot of this rests on what probabilities you mean; for instance, if you mean “very unlikely” in the IPCC sense, that would imply a 0–10% chance, which seems awfully high. I might put the probability of 2C playing a highly significant role in causing a GCR at roughly 1%, which is certainly not territory that can be ignored, although I do think most of the GCR risk comes from the heavy-tailed scenarios detailed above.