A shorter explainer on why focusing on fast growth could be harmful:
Focusing on fast growth means focusing on spreading ideas fast, and ideas that spread fast tend to be 1-dimensional.
Many 1d versions of EA ideas could do more harm than good. Let’s not cause that harm by spreading unhelpful, 1-dimensional takes on extremely complicated and nuanced questions.
Let’s instead spread 2-dimensional takes on EA that are honest, nuanced and intelligent, where people think for themselves.
The 2d takes that include the most robust fundamental concepts (scope insensitivity, cause neutrality, etc.): takes where people recognize that no one has all the answers yet because these are hard questions, and also that smart people have done some thinking, which is better than no thinking.
Let’s get an enormous EA sooner rather than later.
But not so quickly that we end up accidentally doing a lot more harm than good!
We don’t need everyone to have a 4-dimensional take on EA.
Let’s be more inclusive. People don’t need all the moral philosophy for these ideas to be constructive.
However, it is easy to give an overly simplistic impression. We are asking some of the hardest questions humanity could ask. How do we make this century go well? What should we do with our careers in light of this?
Let’s be inclusive, but grow slowly enough to give people a nuanced impression, and slowly enough to offer some social support to people questioning their past choices and future plans.
This all sounds reasonable. But maybe if we’re clever we’ll find ways to spread EA fast and well. In the possible worlds where UGAP or 80K or EA Virtual Programs or the EA Infrastructure Fund didn’t exist, EA would spread slower, but not really better. Maybe there’s a possible world where more/bigger things like those exist, where EA spreads very fast and well.
I doubt anyone disagrees with either of our two comments above. 🙂
I’ve just noticed that when people focus on growing faster, they sometimes push for strategies that I think do more harm than good, because we all forget the higher-level goals mid-project.
I’m not against many growth strategies that are faster than the ones currently implemented.
I am against focusing on faster growth, because the higher-level goal of “faster growth” makes it easy to miss some big-picture considerations.
A better higher-level goal, in my mind, is to focus on fundamentals (like scope insensitivity, cause neutrality, or the Pareto principle applied to career choice and donations) over conclusions.
I think this would result in faster growth with far fewer of the downsides I see in focusing on faster growth.
I’m not against faster growth; I’m against focusing on it. 🤣
Human psychology is hard to manage. I think we need to have helpful slogans that come easily to mind because none of us are as smart as we think we are. 🤣😅 (I speak from experience 🤣)
Focus on fundamentals. I think that will get us further.
Agreed.