Thanks for the reply. I had no idea the spread was so wide (<2% to >98% in the last link you mentioned)!
I guess the nice thing about most of these estimates is that they're still well above the ridiculously low orders of magnitude that might prompt a sense of 'wait, I should actually upper-bound my estimate of humanity's future QALYs to avoid getting mugged by Pascal.' That makes for a pretty firm foundation for longtermism, imo.
One quick question about your post: you mention that some in the community think there is virtually no chance of humanity surviving AGI, and you cite an April Fools' Day post (https://www.lesswrong.com/posts/j9Q8bRmwCgXRYAgcJ/miri-announces-new-death-with-dignity-strategy). I'm not sure if I'm missing some social context behind that post, but have others claimed, in a non-joking manner, that AGI is basically certain to cause an extinction event?