"This effect would be exaggerated by the correlation between rationalist culture and alignment thinking"
Being part of rationalist culture is a sign that someone highly values rationality, yes. But it's also a sign that they belong to a relatively small group, with a strong sense of superiority to normies, some experiments in communal living, and a view of those outside the group as often morally and intellectually corrupt ("low decouplers," "not truth-seeking," etc.). Groups like that are not usually known for dispassionately and objectively looking at the evidence on beliefs that are central to group identity, and belief that AI risk is high seems fairly central to rationalist identity to me. It certainly could be (I mean this non-sarcastically) that rationalist culture is an exception to the general rule because it places such a high value on updating on new evidence and changing your mind, but I don't think we can be confident that rationalists are more likely to evaluate information on AI risk fairly than other people of comparable intelligence. Though I agree they are certainly better informed on AI risk than Good Judgment superforecasters, and as a GJ superforecaster, my views on AI risk have trended towards those of the rationalists recently (though still far from a >50% p(doom)).
I agree optimists have other biases too, most simply status quo bias. Frankly, for most of the things GJ people forecast, status quo bias is not really a "bias" at all so much as a fallible but useful heuristic, but it is probably not a great one to apply to a genuinely revolutionary new technology.
Agreed on all counts, except that a strong value on rationality seems very likely to be an advantage in reaching more-correct beliefs on average. Feeling good about changing one's mind instead of bad is going to lead to more belief changes, and those tend to lead toward truth.
Good points on the rationalist community being a bit insular. I don't think about that much myself because I've never been involved with the Bay Area rationalist community, just LessWrong.