I wish I could be of help in this, but I just lack the expertise. I think part of the issue is that ‘the consensus’ (as per IPCC reports) doesn’t model worst-case scenarios, and I think most climate scientists do not predict human extinction from warming, even at extreme levels. It also wouldn’t make rational sense for Ord or MacAskill to try to ‘outsmart’ the literature: if anything, I’d guess they would prefer to be able to include global warming among existential risks, as it would be an easy and popular win as a cause, so my prior is that they do indeed gauge the expert consensus well. Becker’s sources are mostly the two scientists mentioned, who seem (from a quick glance) to come from collapse-focused research that emphasizes high uncertainty and worst-case feedback loops.
Thanks for the discussion, David and Manuel.
I very much agree that most climate scientists do not predict human extinction from warming, and I guess Toby’s and Will’s estimates for the existential risk from climate change are much higher than the median expert’s guess for the risk of human extinction from climate change. Toby guessed an existential risk from climate change from 2021 to 2120 of 0.1 %. Richards et al. (2023) estimate “∼6 billion deaths [they say “∼5 billion people” elsewhere] due to starvation by 2100” for a super unlikely “∼8–12 °C+” of global warming by then, and I think they hugely overestimated the risk. Most importantly, they assumed land use and cropland area to be constant.
Yeah, I think I recall David Thorstad also complaining that Ord’s estimate was far too high.
Be careful not to conflate “existential risk” in the special Bostrom-derived definition that I think Ord, and probably Will as well, are using with “extinction risk”, though. X-risk from climate *can* be far higher than extinction risk, because regressing to a pre-industrial state and then not succeeding in reindustrialising (perhaps because easily accessible coal has been used up) counts as an existential risk, even though it doesn’t involve literal extinction. (Though from memory, I think Ord is quite dismissive of the possibility that there won’t be enough accessible coal to reindustrialise, but I think Will is a bit more concerned about this?)
Thanks for the clarification, David. There are so many concepts of existential risk, and they are often so vague, that I think estimates of existential risk can vary by many orders of magnitude even holding constant a given author’s definition in words. So I would prefer discussions to focus on outcomes like human extinction, which are well defined, even if their chance remains very hard to estimate.
I also think human extinction without recovery to a similarly promising state is much less likely than human extinction. If the time from human extinction to that kind of recovery follows an exponential distribution with a mean of 66 M years (the time from the last mass extinction until humans evolved), and the Earth remains habitable, and therefore recovery possible, for another 1 billion years, the probability of not recovering conditional on human extinction would be 2.63*10^-7 (= e^(-10^9/(66*10^6))).
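A minimal sketch of that arithmetic, just to make the calculation above explicit; the exponential distribution, its 66-million-year mean, and the 1-billion-year habitability window are the assumptions stated in the previous paragraph, not established facts:

```python
import math

# Assumptions carried over from the comment above:
# - time from human extinction to recovery of a similarly promising state
#   is exponentially distributed with a mean of 66 million years
# - the Earth remains habitable (so recovery stays possible) for 1 billion years
mean_recovery_time_years = 66e6
habitable_window_years = 1e9

# For an exponential distribution, P(T > t) = exp(-t / mean).
p_no_recovery = math.exp(-habitable_window_years / mean_recovery_time_years)

print(f"P(no recovery before Earth becomes uninhabitable | extinction) = {p_no_recovery:.2e}")
# ~2.63e-07 under these assumptions, so conditional on extinction, recovery would be
# almost certain, and extinction *without* recovery would be correspondingly much
# less likely than extinction itself.
```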
I have talked to IPCC people. I think it takes some double standards to believe that climate experts consider existential risk from climate change (as typically defined in the longtermist literature: something that permanently prevents humanity from reaching future technological maturity) considerably less likely than AI experts consider existential risk from unfriendly artificial general intelligence.
What did the IPCC people say exactly?
What @Manuel Del Río Rodríguez 🔹 calls “collapse-focused” views. Stated most minimally: that medium-term involuntary global degrowth is likely if CO2 emissions aren’t strongly curbed in the short term.
Is there actually an official IPCC position on how likely degrowth from climate impacts is? I had a vague sense that they were projecting a higher world GDP in 2100 than now, but when I tried to find evidence of this for 15 minutes or so, I couldn’t actually find any. (I’m aware that even if that is the official IPCC best-guess position, it does not necessarily mean that climate experts are less worried about X-risk from climate than AI experts are about X-risk from AI.)
Yeah, I think the problem is that surveying experts for their p(doom) isn’t something that has been done with climate experts, AFAICT. (I’ll let you decide whether this should be done, or whether Mitchell is right and this methodology is bad to begin with.) But he did say the IPCC is planning to discuss degrowth more extensively in future reports.
That may be true, but it isn’t the argument Becker is making; it would still mean that the book author is at best dissembling when he says that expert consensus on x-risks from global warming is very different from what Ord and MacAskill state.
“dissembling”?
Two direct quotes: “There are two issues here. The first is that Ord and MacAskill are out of step with the scientific mainstream opinion on the civilizational impacts of extreme climate change. In part, this seems to stem from a failure to imagine how global warming can interact with other risks (itself a wider issue with their program), but it’s also a failure to listen to experts on the subject, even ones they contact themselves”.
“Ord and MacAskill’s confidence that climate change probably doesn’t pose the kind of existential threat they’re worried about is unwarranted. And the fact that they’re primarily worried about existential threats in the first place is the other problem: once a threat has been deemed existential, it’s impossible to outweigh it with any less-than-existential threat in the present day”.
The first one points most clearly in the direction that Ord’s and MacAskill’s estimates aren’t within the pale of scientific mainstream opinion. It connects to a footnote (16) that links to https://digressionsnimpressions.typepad.com/digressionsimpressions/2022/11/on-what-we-owe-the-future-no-not-on-sbfftx.html, which is definitely not some summary or compilation of mainstream views on global warming effects, but a philosopher’s review of What We Owe the Future. Perhaps this is a mistake. Note 14 does link to an article by none other than E. Torres on ‘What “longtermism” gets wrong about climate change’, which seems to be the authority produced for the thesis that Ord and MacAskill’s views are far from the scientific mainstream on this. Torres states that he contacted ‘a number of leading researchers’, but these appear to be cherry-picked: the sourcing runs through Torres’s selection of experts, not through any systematic survey of IPCC consensus.