I made this comment with the assumption that some of these people could have extremely valuable skills to offer to the problems this community cares about. These are students at a top UK uni for the sciences, many of whom go on to become significantly influential in politics and business, at rates much higher than at other unis or in the general population.
I agree that not every student fits this category, or is someone who will ever be inclined towards EA ideas. However, I don’t know whether we can claim that being in this category (e.g. being in the top N% at Cambridge) correlates with a more positive baseline impression of EA community building. Maybe the more conscientious people weren’t the ringleaders in making the comments, but they will certainly hear them, which I think could have social effects.
I agree that EA will not be for everyone, and that we should seek good intellectual critiques from those who disagree on intellectual grounds. But to me the thrust of this post (and the phenomenon I was commenting on) was: there are many people with the ability to solve the world’s biggest problems. It would be a shame to lose their inclination purely because of our CB strategies. If our strategy could be nudged to make a better impression at people’s first encounter with EA, we could capture more of this talent and direct it towards the world’s biggest problems. Community building strategy feels much more malleable than the content of our ideas or common conclusions, which we might indeed want to be more bullish about.
I do accept that even the optimal approach to community building will turn some people off, but it’s worth thinking about this intentionally. As EA grows, CB culture becomes harder to fix (if it isn’t already too large to change course significantly).
One thing I didn’t clarify in my original comment: it was my impression that many of them had already encountered EA, rather than picking this up from the table’s messaging. It’s been too long to confirm for sure now, and more surveying would help. This wouldn’t be surprising, though, as EA has a larger presence at Cambridge than at most other unis (and not everyone at a freshers’ fair is a first-year; many later-stage students attend to pick up new hobbies and the like).
But to me the thrust of this post (and the phenomenon I was commenting on) was: there are many people with the ability to solve the world’s biggest problems. It would be a shame to lose their inclination purely because of our CB strategies. If our strategy could be nudged to make a better impression at people’s first encounter with EA, we could capture more of this talent and direct it towards the world’s biggest problems.
Another way of stating this is that we want to avoid misdirecting talent away from the world’s biggest problems. This might occur if EA has identified those problems and effectively motivates its high-aptitude members to work on them, but fails to recruit the maximum number of high-aptitude members because its CB strategies are optimized for attracting larger numbers of low-aptitude members.
This is clearly a possible failure mode for EA.
The epistemic thrust of the OP is that we may be missing out on information that would allow us to determine whether or not this is so, largely due to selection effects and the streetlight effect.
Anecdata is a useful starting place for addressing this concern. My objective in my comment above was to point out that this is, in the end, just anecdata, and to question the extent to which we should update on it. I also wanted to focus attention on the people I expect to have the most valuable insights about how EA could do better at attracting high-aptitude members; I expect that most of these people are not the sort who refer to EA as a “cult” from the next table down at a Cambridge freshers’ fair, but I could be wrong about that.
In addition, I want to point out that the character models of “Alice” and “Bob” are the merest speculation. We could spin other stories about “Cindy” and “Dennis” in which the smart, independent-minded skeptic is attracted to EA and the aimless believer is drawn to some other table at the freshers’ fair. We could also spin stories in which CB folks work to minimize the perception that EA is a cult, and that effort itself has a negative impact on high-talent recruitment.
I am very uncertain about all this, and I hope that this comes across as constructive.