I think the value of information is really high for the Future Fund. If p(doom) is really high (e.g., the largest prize is claimed), they might decide to almost exclusively focus on AI stuff — this would be a major organizational change that (potentially/hopefully) would help with AI risk reduction quite a bit.
Hm, agreed. The general arguments in the post are probably overwhelmed in most cases by considerations specific to each case.