> people reporting very low P(doom) numbers just don’t understand the AI alignment problem
… or disagree with the frame.
Another selection effect: people who are more interested in x-risk tend to participate in these forecasts at disproportionately high rates.
https://forum.effectivealtruism.org/posts/EG9xDM8YRz4JN4wMN/samotsvety-s-ai-risk-forecasts might be of interest.