In fact, I think that it’s harder to get a very big (or very fast-growing) set of people to do the “reason and evidence” thing well. I think that reasoning carefully is very hard, and building a community that reasons well together is very hard.
I am very keen for EA to be about the “reason and evidence” thing, rather than about specific answers. But in order to do this, I think that we need to grow cautiously (maybe around 30%/year) and in a pretty thoughtful way.
I think that it’s harder to get a very big (or very fast-growing) set of people to do the “reason and evidence” thing well. I think that reasoning carefully is very hard, and building a community that reasons well together is very hard.
I agree with this. I think it’s even harder to build a community that reasons well together when we come across dogmatically (and we risk cultivating an echo chamber).
Note: I do want to applaud a lot of the recent work that the CEA core team is doing to avoid this; the updates to effectivealtruism.org, for example, have helped!
I am very keen for EA to be about the “reason and evidence” thing, rather than about specific answers. But in order to do this, I think that we need to grow cautiously (maybe around 30%/year) and in a pretty thoughtful way.
A couple of things here:
Firstly, 30%/year is pretty damn fast by most standards!
Secondly, I agree that being thoughtful is essential (that’s a key part of my central claim!).
Thirdly, some of the rate of growth is within “our” control (e.g. CEA can control how much it invests in certain community-building activities). However, a lot of it isn’t. People are noticing as we ramp up activities labelled EA, or even loosely associated with it.
For example, to avoid growing faster than 30%/year, should someone tell Will and the team promoting WWOTF to pull back on the promotion? What about telling SBF not to support more candidates or scale up the FTX Future Fund? Should we stop promoting EA to new donors/GWWC members? Should GiveWell stop scaling up?
If anything associated with EA grows, it’ll trickle through to more people discovering it.
I think we need to accept that growth isn’t entirely within our control, and to act thoughtfully in light of this.
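To put the 30%/year figure in perspective, a quick compounding calculation (the numbers here are purely illustrative, not estimates of actual EA growth) shows why it is fast by most standards:

```python
import math

rate = 0.30  # the hypothetical 30%/year growth cap under discussion

# Years for the community to double at this rate: ln(2) / ln(1 + r)
doubling_time = math.log(2) / math.log(1 + rate)

# Total growth factor over a decade of sustained 30%/year growth
decade_multiple = (1 + rate) ** 10

print(f"Doubling time: {doubling_time:.1f} years")        # ~2.6 years
print(f"Ten-year growth factor: {decade_multiple:.1f}x")  # ~13.8x
```

In other words, sustained 30%/year growth doubles the community roughly every two and a half years, and multiplies it nearly fourteen-fold in a decade.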
Agree that echo chamber/dogmatism is also a major barrier to epistemics!
“30% seems high by normal standards”—yep, I guess so. But I’m excited about things like GWWC trying to grow much faster than 30%, and I think that’s possible.
Agree it’s not fully within our control, and that we might not yet be hitting 30%. I think that if we’re hitting >35% annual growth, I would begin to favour cutting back on certain sorts of outreach efforts or doing things like increasing the bar for EAG. I wouldn’t want GW/GWWC to slow down, but I would want you to begin to point fewer people to EA (at least temporarily, so that we can manage the growth). [Off the cuff take, maybe I’d change my mind on further reflection.]
+1 to this.
Are there estimates about current or previous growth rates?
There are some, e.g. here.