> there isn’t even an organisation dedicated to growing the movement
Things that are not movements:
Academic physics
Successful startups
The rationality community
They all need to grow to some extent, but they have a particular goal that is not generic ‘growth’. Most ‘movements’ are primarily looking for something like political power, and I think that’s a pretty bad goal to optimise for. It’s the perennial offer to all communities that scale: “try to grab political power”. I’m quite happy to continue being for something other than that.
Regarding the size of the rationality and EA communities right now, this doesn’t really seem to me like a key metric? A more important variable is whether you have infrastructure that sustains quality at the scale the community is at.
The standard YC advice is that the best companies stay small for a long time. An example of Paul Graham saying this is here; search for “I may be an extremist, but I think hiring people is the worst thing a company can do.”
There are many startups that have $500 million and 100 more employees than your startup, but don’t actually have product-market fit, and are going to crash next year. Whereas you might work for 5-10 years and then have a product that can scale to several billion dollars of value. Again, scaling right now will seem shiny and appealing, but it’s something you should often fight against.
Regarding growth in the rationality community, I think a scientific field is a useful analogue. If I told you I’d started some new field and within the first 20 years had gotten a research group into every university, is that necessarily good? Is my field machine learning? Is it bioethics? I bet all the fields that hit the worst of the replication crisis experienced fast growth at some point in the past 50 years. Regardless of intentions, the infrastructure matters, and it’s not hard to simply make the world worse.
Other thoughts: I agree that the rationality project has resulted in a number of top people working on AI x-risk, effective altruism, and related projects, and that its ideas produced a lot of the epistemic bedrock for the community to be successful at noticing important new ideas. I am also sad there hasn’t been better internal infrastructure built in the past few years. As Oli Habryka said downthread (amongst some other important points), the org I work at, which built the new LessWrong (and the AI Alignment Forum and EA Forum, which is evidence for your ‘rationalists work on AI and EA’ claim ;) ), is primarily trying to build community infrastructure.
Meta thoughts: I really liked the OP; it concisely brought up a relevant proposal and placed it clearly in the EA frame (Pareto principle, heavy-tailed outcomes, etc.).
> The size of the rationality community hasn’t been limited so much by quality concerns, as by lack of effort expended in growth.
I think it is easy to grow too early, and many of the naive ways of putting effort into growth would be net negative compared to the counterfactual (somewhat analogous to a company that quickly makes $1 million when it might’ve made $1 billion).
Focusing on actually making more progress with the existing people, by building more tools for them to coordinate and collaborate, seems to me the current marginal best use of resources for the community.
(I agree that effort should be spent improving the community, I just think ‘size’ isn’t the right dimension to improve.)
Added: I suppose I should link back to my own post on the costs of coordinating at scale.