Strong agree; EA being enormous would be good.
I hope we successfully make EA enormous quickly, and I hope we pursue interventions for making EA enormous beyond just being more welcoming on the margin.
I agree that EA being enormous eventually would be very good. 🙂
However, I think there are plenty of ways that quick, short-term growth strategies could end up stunting our growth. 😓
I also think that being much more welcoming might be surprisingly significant due to compounding growth (as I explain below). 🌞
It sounds small, “be more welcoming”, but a small change in angle between two paths can result in a very different end destination. It is absolutely possible for marginal changes to completely change our trajectory!
We probably don’t want effective altruism to lose its nuances. I also think nuanced communication is relatively slow (because it is often best done, at least in part, through many conversations with people in the community)[1]. I think we could manage a 30% growth rate and keep our community centred on a nuanced version of effective altruism, but we probably couldn’t triple our community’s size every year and stay nuanced.
However, growth compounds. Growing “only” 30% is not really that slow if we think in decades!
If we grow at a rate of 30% each year, then we’ll be 500,000 times as big in 50 years as we are now.[2]
Obviously growth will taper off (we’re not going to grow exponentially forever), but I think the point at which it tapers off is a very big deal. That saturation point, the maximum community size we hit, matters more for EA ending up enormous than our growth rate along the way does. We can probably focus on “slow” growth strategies and still end up enormous relatively soon (30% is actually very fast growth, but it can be achieved without many of the sorts of activities you might typically think of as fast-growth strategies).[3]
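To make this concrete, here is a minimal sketch in Python. Every number in it (the starting size, growth rates, and caps) is hypothetical, picked purely for illustration rather than as an estimate of anything:

```python
# Toy model: a community grows exponentially each year until it hits a
# saturation cap (the maximum size its reputation allows it to reach).
def final_size(start, annual_rate, cap, years):
    size = start
    for _ in range(years):
        size = min(size * (1 + annual_rate), cap)
    return size

# Sanity-check footnote 2: 30% growth compounded over 50 years.
print(f"1.3^50 = {1.3 ** 50:,.0f}")  # ~497,929, i.e. roughly 500,000x

# "Slow", welcoming growth with a high saturation point...
slow = final_size(start=10_000, annual_rate=0.30, cap=10_000_000, years=50)

# ...versus tripling every year but burning goodwill, so capping out early.
fast = final_size(start=10_000, annual_rate=2.00, cap=100_000, years=50)

print(f"30%/yr, 10M cap:   {slow:,.0f}")  # hits the cap around year 27
print(f"200%/yr, 100k cap: {fast:,.0f}")  # hits the cap around year 3
```

Under these made-up numbers, the slower community ends up 100 times larger despite growing at a fraction of the rate: over a few decades, the cap dominates and the rate barely matters.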
I actually think one of the biggest factors in how big we grow is how good an impression we leave on people who don’t end up in our community. We will taper off earlier if we have a reputation for being unpleasant. We can grow at 30% a year with local groups doing much of the work of leaving people with a great impression, whether or not those people decide to engage much with the community after forming that first impression.
If we have a reputation for being a lovely community, we’re much more likely to be able to grow exponentially for a long time.
Therefore, I do think being really nice and welcoming is a really huge deal and more short-term strategies for fast growth that leave people confused and often feeling negatively about us could, in the end, result in our size capping out much earlier.
Whether or not we have the capacity for all the people who could be interested in effective altruism right now (only being able to grow so fast in a nuanced way limits our capacity), we still do have the capacity to leave more people with a good impression.
1. Books and articles don’t talk back and so can’t explore the various miscellaneous thoughts that pop up for a person who is engaging with EA material, when that material is thought-provoking for them.
2. 1.3^50 ≈ 5 × 10^5 (I find the weird neatness of these numbers quite poetic 😻)
3. More of my thoughts on what could be important to focus on are here.
This isn’t to say I’m against broad outreach efforts. It’s just to say that it is really important to lay the groundwork for a nuanced impression later on with any broad outreach effort.
I actually think being welcoming to a broad range of people and ideas is really about being focused on conveying to people who are new to effective altruism that the effective altruism project is about a question.
If they don’t agree with the current set of conclusions, that is fine! That’s encouraged, in fact.
People who disagree with our current bottom-line conclusions can still be completely on board with the effective altruism project (and decide for themselves whether their effective altruism project is helped by engaging with the community).
If, in conversations with new people, the message we get across is that the bottom line is not as important as the reasoning processes that get us there, then I think we will naturally be more welcoming to a broader range of people and ideas in a way that is genuine.
Coming across as genuine is such an important part of leaving a good impression, so I don’t think we can “pretend” to be broader-spectrum than we actually are.
We can be honest about exactly where we are at while still encouraging others to take a broader view than us, by distinguishing the effective altruism project from the community.
I think there is a tonne of value in making sure we are advocating for the project, and not the community, in outreach efforts with people who haven’t interacted much with the community.
If newcomers don’t want to engage with our community, they can still care a tonne about the effective altruism project. They can collaborate with members of the community to whatever extent that helps them do what they believe is best for helping others with the resources they are putting towards the project.
I’d love to see us become exceptionally good at going down tangents with new people to explore the merits of the ideas they have. This makes both them and us way more open to the new ideas that develop in these conversations. It is also a great way to demonstrate, to people who haven’t interacted with us much before, how people in this community think.
How we think is much more core to effective altruism than any conclusion we have right now (at least as I see it). Showing how this community thinks will, eventually, lead people we have these conversations with to conclusions we’d be interested in anyway (if we’re doing those conversations well).
Strongly agree that being more welcoming is critical! I focused more on the negatives (not being hostile to people who are potential allies), but I definitely think both are important.
That said, I really hate the framing of “not having capacity for people”. We aren’t, or should not be, telling everyone that they need to work at EA organizations to be EA-oriented. Even ignoring the fact that career capital is probably critical for many of the people joining, it’s OK for EAs to have normal jobs and normal careers and donate. And if they are looking for more involvement, reading more, writing, blogging, talking to friends, and attending local meet-ups is a great start.
I agree with that. 🙂
I consider myself a part of the community, and I am not employed at an EA org, nor do I intend to be anytime soon, so I know that having an EA job or funding is not needed for that.
I meant the capacity to give people a nuanced enough understanding of the existing ideas and thinking processes as well as the capacity to give people the feeling that this is their community, that they belong in EA spaces, and that they can push back on anything they disagree with.
It’s quite hard to communicate the fundamental ideas, and how they link to current conclusions, in a nuanced way. Integrating people into any community without fracturing it or losing the trust that community members have in each other (while still allowing new members to push back on old ideas they disagree with) takes time, and I think it can only be done if we grow at a slow enough pace.
(I strongly agree that we should be nice and welcoming. I still think trying to make EA enormous quickly is good if you can identify reasonable such interventions.)
I agree that faster is better if the end size of our community stays the same. 👌🏼 I also think it’s possible that faster growth increases the end size of our community. 🙂
Sorry if my past comment came across a bit harshly (I clearly have just been over-thinking this topic recently 😛)![1]
I do have an intuition, which I explain in more detail below, that lots of ways of growing really fast could end up making our community’s end size smaller. 😟
Therefore, I feel like focusing on fast growth is much less important than focusing on laying the groundwork to have a big end capacity (even if it takes us a while to get there).
It’s so easy to get caught up in short-term metrics, so I think focusing on fast short-term growth could take attention away from asking whether that short-term growth is costing us long-term growth.
I don’t think we’re in danger of disappearing given our current momentum.
I do think we’re in danger of leaving a bad impression on a lot of people, though, so I think it is important to manage that as well as we can. My intuition is that it will be easier to work out how to leave a good impression if we don’t try to grow very fast in a very short amount of time.
Having said that, I’m also not against broad outreach efforts. I simply think that when doing broad outreach, it is really important to keep in mind whether the messages being sent out lay the groundwork for a nuanced impression later on (it’s easy to spread memes that make more nuanced communication much harder).
However, I think memes about us are likely to spread if we’re trying to do big projects that attract media attention, whether or not we are the ones to spread those broad outreach messages.
I could totally buy that, if we’re going to get attention regardless of whether we strategically prepare for it, it is important to do our best to get out the broad outreach messages we think are most valuable.
I have concrete examples in my post here of what I call “campground impacts” (our impact through our influence on people outside the EA community). If outreach results in a “worse campground”, then I think our community’s net impact will be smaller (so I’m against it). If outreach results in a “better campground”, then I think our community’s net impact will be bigger (so I’m for it). If faster-growth strategies result in a better campground, then I’m probably for them; if they result in a worse campground, then I’m probably against them. 😛
I went back and edited it after Zach replied, to more accurately convey my vibe, because my first draft was all technicalities and no friendly vibes, which I think is no way to have a good forum discussion! (sorry!)
(ok, you caught me, I mainly went back to add emojis, but I swear emojis are an integral part of good vibes when discussing complex topics in writing 😛🤣: cartoon facial expressions really do seem better than no facial expressions to convey that I am an actual human being who isn’t actually meaning to be harsh when I just blurt out some unpolished thoughts in a random forum comment 😶😔🤔💡😃😊)
A shorter explainer on why focusing on fast growth could be harmful:
Focusing on fast growth means focusing on spreading ideas fast, and ideas that are fast to spread tend to be one-dimensional.
Many one-dimensional versions of the EA ideas could do more harm than good. Let’s not do much more harm than good by spreading unhelpful, one-dimensional takes on extremely complicated and nuanced questions.
Let’s spread two-dimensional takes on EA that are honest, nuanced, and intelligent, and that leave people thinking for themselves.
These two-dimensional takes should include the most robust fundamental concepts (scope insensitivity, cause neutrality, etc.): takes where people recognize that no-one has all the answers yet, because these are hard questions, but also that smart people have done some thinking, and that is better than no thinking.
Let’s get an enormous EA sooner rather than later.
But not so quickly that we end up accidentally doing a lot more harm than good!
We don’t need everyone to have a four-dimensional take on EA.
Let’s be more inclusive: there’s no need for all the moral philosophy for these ideas to be constructive.
However, it is easy to give an overly simplistic impression. We are asking some of the hardest questions humanity could ask. How do we make this century go well? What should we do with our careers in light of this?
Let’s be inclusive, but grow slowly enough to give people a nuanced impression, and slowly enough to offer some social support to people questioning their past choices and future plans.
This all sounds reasonable. But maybe if we’re clever we’ll find ways to spread EA fast and well. In the possible worlds where UGAP or 80K or EA Virtual Programs or the EA Infrastructure Fund didn’t exist, EA would spread slower, but not really better. Maybe there’s a possible world where more/bigger things like those exist, where EA spreads very fast and well.
I doubt anyone disagrees with either of our above two comments. 🙂
I have just noticed that when people focus on growing faster, they sometimes push for strategies that I think do more harm than good, because we all forget the higher-level goals mid-project.
I’m not against a lot of the faster-growth strategies that currently get implemented.
I am against focusing on faster growth, because the higher-level goal of “faster growth” makes it easy to miss some big-picture considerations.
A better higher-level goal, in my mind, is to focus on fundamentals (like scope insensitivity, cause neutrality, or the Pareto principle applied to career choice and donations) over conclusions.
I think this would result in faster growth with far fewer of the downsides I see in focusing on faster growth.
I’m not against faster growth; I am against focusing on it. 🤣
Human psychology is hard to manage. I think we need to have helpful slogans that come easily to mind because none of us are as smart as we think we are. 🤣😅 (I speak from experience 🤣)
Focus on fundamentals. I think that will get us further.
Agreed.