People who know that they are outliers among experts in how likely they think X is (as I think being 99% sure of doom is, particularly when combined with short-ish timelines) should be cautious about taking extreme actions on the basis of that outlying view, even if they think they have already down-weighted their confidence to account for the fact that other experts disagree and still ended up north of 99%. Otherwise you get the problem that extreme actions are taken even when most experts think they will be bad. In that sense, integrity of the kind you're praising is potentially very bad and dangerous, even if there are some readings of "rational" on which it counts as rational.
Of course, what Eliezer is doing is not taking extreme actions himself, but recommending that governments do so in certain circumstances, and that is much less obviously a bad thing to do, since governments will also hear from experts who are closer to the median view.