What probability would you assign to some weakened version of (3) being true? By "weakened version," I roughly mean dropping the "way" from "way too many," and defining "too many" as roughly "meaningfully above the base rate for people in positions of power/influence."
10%.
Worth noting that it’s not the highest of bars.
Agreed that it's not the highest of bars. I felt there was a big gap between your (2) and (3), so I was aiming at roughly 2.4 to 2.5: neither peripheral nor widespread, with the understanding that the implied scale is somewhat exponential (since 3 is much worse than 2).
Yeah, I should’ve phrased (3) in a way that’s more likely to pass someone like habryka’s Ideological Turing Test.
Basically, I think if EAs were even just a little worse than typical people in positions of power (on the dimension of integrity), that would be awful news! We really want them to be significantly better.
I think EAs are markedly more likely to be fanatical naive consequentialists, which can be one form of "lacking integrity" and is the main thing* that makes me worry I might be wrong here. To counteract that tendency, EAs need to be above average in integrity on other dimensions.
*Ideology-induced fanaticism is my main concern, but I can think of others. EA probably also attracts communal narcissists to some degree, or people who like the thought that they are special and can have lots of impact. Also, according to some studies, endorsing utilitarian judgments correlates with psychopathy, at least in trolley-problem contexts. However, EA also (and perhaps more strongly) attracts people who are unusually caring and compassionate, and it motivates people who don't intrinsically care about power to seek it, an effect with strong potential to make things better.