I’m claiming, per the other comment, that relative speed would be both the substantive largest uncertainty, and the largest source of disagreement.
Despite Claim 1, if technology changes rapidly, the emissions and warming levels which are “less uncertain” could change far faster than expected, which changes the question in important ways. And I think Claim 2 is mistaken in its implication: even if the risks of existential catastrophe from AI and biorisk are not obviously several orders of magnitude higher—though I claim that they are—the probability of developing radically transformative technology of one of the two types is far less arguably of the same order of magnitude, and that is the necessary crux.