But to me the thrust of this post (and the phenomenon I was commenting on) was: there are many people with the ability to solve the world's biggest problems. It would be a shame to lose their inclination to do so purely because of our CB strategies. If our strategy could be nudged to make a better impression at people's first encounter with EA, we could capture more of this talent and direct it toward the world's biggest problems.
Another way of stating this is that we want to avoid misdirecting talent away from the world’s biggest problems. This might occur if EA has identified those problems, effectively motivates its high-aptitude members to work on them, but fails to recruit the maximum number of high-aptitude members, due to CB strategies optimized for attracting larger numbers of low-aptitude members.
This is clearly a possible failure mode for EA.
The epistemic thrust of the OP is that we may be missing out on information that would allow us to determine whether or not this is so, largely due to selection and streetlamp effects.
Anecdata is a useful starting place for addressing this concern. My objective in my comment above was to point out that this is, in the end, just anecdata, and to question the extent to which we should update on it. I also wanted to focus attention on the people who I expect to have the most valuable insights into how EA could do better at attracting high-aptitude members; I expect that most of these people are not the sort who refer to EA as a “cult” from the next table down at a Cambridge freshers' fair, but I could be wrong about that.
In addition, I want to point out that the character models of “Alice” and “Bob” are the merest speculation. We can spin other stories about “Cindy” and “Dennis” in which the smart, independent-minded skeptic is attracted to EA and the aimless believer is drawn to some other table at the freshers' fair. We can also spin stories in which CB folks work to minimize the perception that EA is a cult, and this effort itself has a negative impact on high-talent recruitment.
I am very uncertain about all this, and I hope that this comes across as constructive.