Thanks Nathan. I definitely see the tensions here. Hopefully these clarifications will help :)
> I reckon it’s better if we focus on being a smaller, highly engaged community rather than a really big one.
My central claim isn’t about the size of the community; it’s about the diversity of EA that we present to the world (and represent within EA), and about staying true to the core question, not to a particular set of conclusions.
It also depends on what you mean by “focus”. The community will always be, to some degree, concentric circles of engagement. The total size and the relative distribution across those circles will vary depending on what we focus on. My central claim is that the total impact of the community will be higher if it remains a “big tent” that sticks to the core question of EA. The mechanism is that we generate more engagement at each level, with more allies and fewer adversaries.
> Do low-engagement people become high-engagement?
I’ve never seen someone become highly engaged instantly. I’ve only seen engagement increase incrementally (sometimes fast, sometimes slow, sometimes it hits a point and tapers off, and sadly sometimes high engagement turns into high anti-engagement).
> I don’t think that EA is for everyone. Again, this clashes with what I said above, but I think that it can be harder for people who leave a community after some time than for those who are rejected at the door. If my above point is correct, then there should be some way to signal to people that EA is for people who want to really engage, and that it may not be for everyone.
That depends on what you mean by EA. In my conception (and the conception I advocate for), everyone is an effective altruist to some extent some of the time, and nobody is entirely an effective altruist all of the time. Effective altruism is a way of thinking, not an identity. Some people are part of the “EA community”, while others eschew the label and the community yet have much higher impact than most people within it, because they’ve interrogated big world problems and taken significant positive action.