One quick question about your post—you mention that some in the community think there is virtually no chance of humanity surviving AGI and cite an April Fool’s Day post. (https://www.lesswrong.com/posts/j9Q8bRmwCgXRYAgcJ/miri-announces-new-death-with-dignity-strategy) I’m not sure if I’m missing some social context behind this post, but have others claimed that AGI is basically certain to cause an extinction event in a non-joking manner?
Good question. Short answer: despite being an April Fools' post, it seems to encapsulate much of what Yudkowsky actually believes – so the social context is that the joke is in the tone and framing rather than in the author's underlying attitude; sorry I can't link to anything to further substantiate this. I believe Yudkowsky's general policy is not to put numbers on his estimates.
Better answer: Here is a somewhat up-to-date database of existential risk estimates from some folks in the community. You'll notice these are far below near-certainty.
One of the studies listed in the database is this one, in which a few researchers put the chance of doom quite high.
Thanks for the reply. I had no idea the spread was so wide (<2% to >98% in the last link you mentioned)!
I guess the nice thing about most of these estimates is that they are still well above the absurdly low orders of magnitude that might prompt the thought, 'wait, I should actually upper-bound my estimate of humanity's future QALYs in order to avoid getting mugged by Pascal.' It's a pretty firm foundation for longtermism, imo.
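To make that concrete, here is a toy expected-value calculation; the specific numbers are purely illustrative assumptions of mine, not figures from the database or survey above. Taking an extinction probability near the low end of the spread, say p = 0.02, and a deliberately modest hypothetical stake of N = 10^16 future QALYs, the expected loss is

E[\text{loss}] = p \cdot N = 0.02 \times 10^{16} = 2 \times 10^{14} \text{ QALYs}

The point is that p here is an ordinary-sized probability, not something like 10^{-30}, so acting on this expected value doesn't rely on the 'tiny probability of astronomical payoff' structure that Pascal's mugging exploits.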