I came here after you did and don’t have an answer, but I wanted to comment on this:
One story, the most flattering to EA, goes like this:
“EA is unusually good at ‘epistemics’ / thinking about things, because of its culture and/or who it selects for; and also the community isn’t corrupted too badly by random founder effects and information cascades; and so the best ideas gradually won out among those who were well-known for being reasonable, and who spent tons of time thinking about the ideas. (E.g. Toby Ord convincing Will MacAskill, and a bit later Holden Karnofsky joining them.)”
Can anyone give any outside-view reason to think EA is “unusually good at ‘epistemics’ / thinking about things”, or that “the community isn’t corrupted too badly by random founder effects and information cascades”?
Pet peeve: “spent tons of time thinking about X” is a phrase I encounter often in EA, and for some reason it’s taken to mean “have reached conclusions which are more likely to be true than those of relevant outside experts”. Time spent thinking about something is, by itself, very weak evidence of being right about it. MacAskill and Ord, in my view, get some credit for their ideas because they are actual philosophers with the right qualifications for this job, not because they spent lots of time on it.
I’m not writing this as criticism of the OP, since the story was explicitly offered as a maximally charitable take on EA. My point is just that I think that story is extremely unrealistic.
Can anyone give any outside-view reason to think EA is “unusually good at ‘epistemics’ / thinking about things” [...]?
Here are some possible outside-view reasons. I’m not saying any of them is necessarily true, though I suspect some probably are:
Maybe EAs (on average) have higher educational attainment than the population at large, and having higher educational attainment is correlated with better epistemics.
Maybe EAs write and read more about epistemics and related topics than the population at large, and …
Maybe EAs would score better on a battery of forecasting questions than the population at large, and …
Maybe EAs are higher earners than the population at large, and …
Maybe EAs read more philosophy than the population at large, and …
Of course, it depends on which group you compare to, and on what people are meant to be thinking about.
Thanks. I was thinking more of the scientific establishment, or other professional communities and advocacy groups, or organisations like the Gates Foundation, most of which seem to have very different ideas from EA in at least some areas.
Edit to add: note that the claim is that EA is unusually good at these things.
Btw, I’m not sure why your comment got downvoted (I upvoted it), and would be curious to hear the reasoning of someone who downvoted.
I have some evidence that it isn’t: a commonly cited argument for the importance of AI research says nothing like what ~20–80% of effective altruists think it does.