In conversations about x-risk, one common mistake is to suggest that because we have yet to invent something that kills all people, the historical record is not on the side of “doomers.” The mistake is survivorship bias, and Ćirković, Sandberg, and Bostrom (2010) call this the Anthropic Shadow: using base-rate frequencies to estimate the probability of events that reduce the number of people (observers) will result in a biased, low estimate.
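A toy Monte Carlo sketch can make the bias concrete (the numbers here are illustrative assumptions, not from the paper): catastrophes that kill most observers make timelines with many catastrophes contribute fewer observers, so a randomly sampled surviving observer looking back at their own history underestimates the true catastrophe rate.

```python
import random

random.seed(0)

TRUE_P = 0.2      # assumed true per-period probability of a catastrophe
PERIODS = 10      # number of periods in each timeline
KILL_FRAC = 0.9   # each catastrophe wipes out 90% of observers
TRIALS = 100_000  # number of simulated timelines

total_weight = 0.0
weighted_freq = 0.0
for _ in range(TRIALS):
    pop = 1.0  # surviving observer population, as a fraction of the start
    hits = 0
    for _ in range(PERIODS):
        if random.random() < TRUE_P:
            hits += 1
            pop *= 1 - KILL_FRAC
    # A random observer is drawn in proportion to surviving population,
    # so weight each timeline's observed frequency by its population:
    total_weight += pop
    weighted_freq += pop * (hits / PERIODS)

observer_estimate = weighted_freq / total_weight
print(f"true rate: {TRUE_P}, observer-weighted estimate: {observer_estimate:.3f}")
```

Because timelines with more catastrophes carry exponentially fewer observers, the observer-weighted estimate comes out well below the true rate, which is the shadow in question.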
If there are multiple possible timelines and AI p(doom) is very high (and soon), then we would expect a greater observed frequency of events that delay the creation of AGI (geopolitical issues, regulation, maybe internal conflicts at AI companies, other disasters, etc.). It might be interesting to see whether superforecasters consistently underpredict events that would delay AGI, although figuring out how to actually interpret that information would be quite challenging unless the effect were blatantly obvious.
More likely, I suppose, is that I was simply born in a universe with more people and everything goes fine anyway. This is quite speculative and roughly laid out, but something I’ve been thinking about for a while.