EDIT: I copied and pasted this comment as a direct reply to Chris, then edited that version to make more sense than it did the first time around (and to be much nicer than my off-the-cuff, figuring-out-what-I-thought-as-I-went stream of consciousness). I’ve left this version here anyway, partly for context for the later comments, and partly because I think it’s kind of fun to have a record (even if just for me) of how my thoughts develop as I write: teasing out what sounds plausibly true once I’ve written it, and what doesn’t quite hit the mark of the intuition I’m attempting to articulate (or what intuition, once I do find a way to articulate it, ends up seeming obviously false once written up).
I am not arguing that we shouldn’t target exceptional people. I think exceptionally smart and caring people are far better to spend a lot of one-on-one time with than people who care an average amount about helping others and for whom there is plenty of evidence that they don’t yet have a track record of accomplishing the things they set their minds to.
My guess is that sometimes we filter too hard, and too early, to capture the tail end of the effective altruism community’s impact.
It is easy to form an accurate impression of someone who is similar to you. It is much harder to quickly form an accurate impression of someone who is really different. (And because of diminishing returns, it seems far more valuable on the margin to bring in people who are exceptional in a different way from how the existing community tends to be exceptional than to bring in another person who thinks the same way and has the same skills.)
(I am not confident I will reflectively endorse much of the above 24 hours from now; I’m just sharing my off-the-cuff vibes, which might solidify into more or less confidence once I’ve let these thoughts sit for a bit.)
If my confidence in any of these claims substantially increases or decreases in the next few days, I might come back and clarify that (but if doing so becomes a bit of an ugh field, I’m not going to prioritise de-ughing it, because other ugh fields are higher on my de-ughing priority list 😝).
I think there’s a lot of value in people reaching out to people they already know (this seems undervalued in EA, though maybe that’s intentional, since evangelism can turn people off). This doesn’t seem to trade off too substantially against more formal movement-building methods, which should probably filter more on which groups are going to be most impactful.
In terms of expanding the range of people and skills in EA, that seems to be happening over time: take, for example, the EA blog prize (https://effectiveideas.org/) or the increased focus on PAs (https://pineappleoperations.org/). I have no doubt that there are still many useful skills we’re missing, but there’s a decent chance that funding would be available if there were a decent team to work on the project.
Makes sense
I suspect that some of the ways we filter at existing groups’ events are good, and that we should keep doing them.
I also suspect that some of the strategies and tendencies we have when filtering at the group level are counterproductive to finding and keeping high-potential people.
For example, filtering too fast on how quickly someone seems to “get” longtermism might filter in the people who are more willing to defer, and who therefore seem like they get it more than they actually do.
It might filter out the people who are really trying to think it through: the people who seem more resistant to the ideas, or who are more willing to voice half-formed thoughts that haven’t yet developed into anything deep (forming an inside view means thinking through a lot of different considerations, which takes a lot of time and a lot of voiced “dead-end” thoughts). These higher-value people might systematically be classed as “less tractable” or “less smart” when, in fact, it is sometimes[1] that we have simply forgotten that people who are thinking about these ideas seriously, and who are smart enough to possibly have a tail-end impact, are going to say things that don’t sound smart while they work out what they think. The further someone is from our echo chamber, the stronger I expect this effect to be.
Obviously I don’t know how most groups filter at the group level; it depends so much on the particular community organizers (though there are maybe some cultural commonalities across the movement, which is why I find it tempting to make sweeping generalisations that might not hold in many places).
[1] But obviously not always (and I don’t actually have a clear idea of how big a deal this issue is; I’m just trying to untangle my various intuitions so I can more easily scrutinize whether any of them hold a grain of truth on closer inspection).
Hmm… Some really interesting thoughts. I generally try to determine whether people are actually making considered counter-arguments vs. just repeating clichés, but I take your point that a willingness to voice half-formed thoughts can cause others to assume you’re stupid.
I guess in terms of outreach it makes sense to cultivate practical wisdom, so that you can determine when to patiently continue a conversation and when to politely and strategically withdraw to save energy and avoid wasting time. This won’t be perfect, and it’s subject to the biases you mentioned, but it’s really the best option available.
Hmm, I’m not sure I agree with the claim that “it’s really the best option available”, even though I don’t already have a better solution thought up. At the very least, I think how to foster this culture might be worth a lot of strategic thought.
Even if there’s a decent chance we’d end up concluding there isn’t all that much we can do, I think the payoff from finding a good way to manage this might be big enough to make up for all the possible worlds where the work ends up being a dead end.
Well, if you think of anything, let me know.
👍🏼
Oh, here’s another excellent example: the EA Writing Retreat.
😍
Yeah, this is happening! I also think it helps a lot that Sam BF has a take on longtermism that spans a really broad spectrum of ideas, which is really cool!