Edit: I think my comment below kind of misses the point – my main response is simply: some people could probably do a huge amount of good by, e.g., helping increase meat alternatives R&D budgets; this seems a much bigger opportunity than increasing donations and similarly tractable, so we should focus more on that (while continuing to also increase donations).
--
Some quick thoughts:
I personally think the EA community could plausibly grow 1000-fold compared to its current size, i.e. to 2 million people, which would correspond to ~0.1% of the Western population. I think EA is unlikely to be able to attract >1% of the (Western and non-Western) population, primarily because understanding EA ideas (and being into them) typically requires a scientific and prosocial/altruistic mindset, advanced education, and the right age (no younger than ~16, and not so old that one is too busy with lots of other life goals). Trying to attract >1% of the population would in my view likely lead to a harmful dilution of the EA community. We should decide whether we want to grow more than 1000-fold once we’ve grown 100-fold and have more information. (See the quick arithmetic sketch after these points.)
Low donation rates indeed feel concerning. To me, the lack of discussion of “how can we best make ODA (official development assistance) budgets more effective” and similar questions feels even more concerning, as the latter seems a much bigger missed opportunity.
I think lots of people can get government jobs where they can have a significant positive impact in a relevant area at some point in their career, or otherwise contribute to making governments more effective. I tentatively agree that personal donations seem more impactful than the career impact in many cases, but I don’t think it’s clear that we should overall aim to maximize donations. It could be worth doing some more research into this.
I would feel excited about a project that tries to find out why donation rates are low (lack of money? lack of room for more funding? saving to give later and make donations better-reasoned by giving lump sums? a false perception that money won’t do much good anymore? something else?) and how we might increase them. (What’s your guess for the reasons? I’d be very interested in more discussion of this; it might be worth a separate EA Forum post if one doesn’t exist already.)
As you suggest, if the EA community doesn’t have the expertise to have a positive impact on developing-world policy, perhaps it should develop more of it. I don’t really know, but some of these jobs might not be very competitive or difficult, yet disproportionately impactful. Even if you just try to help discontinue funding programs that don’t work, prevent budget cuts for the ones that do, and generally encourage better resource prioritization, that could be very helpful.
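As a quick sanity check on the growth numbers in the first point above, here is a minimal sketch of the implied figures. The back-calculated values (a current community of ~2,000 people and a Western population of ~2 billion) are my inferences from the stated numbers, not claims made in the comment:

```python
# Figures as stated in the comment above:
target_size = 2_000_000      # "2 million people"
growth_factor = 1_000        # "grow 1000-fold"
western_pop_share = 0.001    # "~0.1% of the Western population"

# Back-calculated figures (my inferences, not stated in the comment):
current_size = target_size / growth_factor              # ~2,000 people
implied_western_pop = target_size / western_pop_share   # ~2 billion people

print(f"Implied current community size: {current_size:,.0f}")
print(f"Implied Western population: {implied_western_pop:,.0f}")
```

Note that ~2 billion is at the high end of common definitions of “the Western population” (narrower definitions are closer to 1 billion), so the ~0.1% figure is probably best read as an order-of-magnitude estimate.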
I think EA is unlikely to be able to attract >1% of the (Western and non-Western) population, primarily because understanding EA ideas (and being into them) typically requires a scientific and prosocial/altruistic mindset, advanced education, and the right age (no younger than ~16, and not so old that one is too busy with lots of other life goals). Trying to attract >1% of the population would in my view likely lead to a harmful dilution of the EA community.
Thanks for stating your view on this, as I would guess this will be a crux for some.
FWIW, I’m not sure if I agree with this. I certainly agree that there is a real risk from ‘dilution’, as well as other risks from both too-rapid growth and too large a total community size.
However, I’m most concerned about these risks if I imagine a community that’s kind of “one big blob” without much structure. But that’s not the only strategy on the table. There could also be a strategy where the total community is quite large but there is structure and diversity within the community regarding what exactly ‘being an EA’ means for people, who interacts with whom, who commands how many resources, etc.
I feel like many other professional, academic, or political communities are both quite large overall and, at least to some extent, maintain spaces that aren’t harmed by “dilution”. Perhaps most notably, consider that almost any academic discipline is huge, and yet there is formal and informal structure that to some extent separates the wheat from the chaff. There is the majority of people who drop out of academia after their PhDs and the tiny minority who become professors; there is the majority of papers that will never be cited or are of poor quality, and then there is the very small number of top journals; there is the majority of colleges and universities where faculty are mostly busy teaching and from which we don’t expect much innovation, and the tiny fraction of research-focused top universities; etc.
I’m not saying this is clearly the way to go, or even feasible at all, for EA. But I do feel quite strongly that “we need to protect spaces for really high-quality interactions and intellectual progress” or similar claims (even if we buy them as assumptions) do not imply it’s best to keep the total size of the community small.
Perhaps as an intuition pump, consider what the life of Ramanujan might have looked like if there hadn’t been a maths book accessible to people in his situation, a “non-elite” university, and other education open to someone like him.
Yeah, these are great points. I agree that with enough structure, larger-scale growth seems possible. Basically, I agree with everything you said. I’d perhaps add that in such a world, “EA” would have a quite different meaning from how we use the term now. I also don’t quite buy the point about Ramanujan – I think “spreading the ideas widely” is different from “making the community huge”.
(Small meta nitpick: I find it confusing to call a community of 2 million people “small” – really wish we were using “very large” for 2 million and “insanely huge” for 1% of the population, or similar. Like, if someone said “Jonas wants to keep EA small”, I would feel like they were misrepresenting my opinion.)
I think “spreading the ideas widely” is different from “making the community huge”
Yeah, I think that’s an important insight I also agree with.
In an ideal world, the best thing to do would be to expose everyone to some kind of “screening device” (e.g. a pitch or piece of content with a call to action at the end) which draws them into the EA community if and only if they’d make a net valuable contribution. In the actual world there is no such screening device, but I suspect we could still do more to expand the reach of “exposure to the initial ideas / basic framework of EA” while relying on self-selection and existing gatekeeping mechanisms to reduce the risk of dilution, etc.
My main concern with such a strategy would actually not be that it risks dilution, but that it would become much more valuable once we have more of a “task Y”, i.e. something a lot of people can do (or some other change that would allow us to better utilize more talent).
I personally think the EA community could plausibly grow 1000-fold compared to its current size, i.e. to 2 million people, which would correspond to ~0.1% of the Western population. I think EA is unlikely to be able to attract >1% of the (Western and non-Western) population, primarily because understanding EA ideas (and being into them) typically requires a scientific and prosocial/altruistic mindset, advanced education, and the right age (no younger than ~16, and not so old that one is too busy with lots of other life goals). Trying to attract >1% of the population would in my view likely lead to a harmful dilution of the EA community. We should decide whether we want to grow more than 1000-fold once we’ve grown 100-fold and have more information.
I meant this slightly differently than you interpreted it, I think.
My best guess is that less than 10% of the Western population are capable of entering potentially high-impact career paths, and we already have plenty of people in the EA community for whom this is not possible. This can be for a variety of reasons: they are not hard-working enough, not smart enough, do not have sufficient educational credentials, are chronically ill, etc.
But maybe you think that most people in the current EA community are very well qualified to enter high-impact career paths, and our crux is there?
While I agree that government jobs are easier to get into than other career paths lauded as high impact in the EA community (at least this seems to be true for the UK civil service), my impression is that I am a lot more skeptical than other EAs that government careers are a credible high-impact career path. I say this as someone who has a government job. I have written a bit about this here, but my thinking on the matter is currently very much a work in progress, and the linked post does not include most of the reasons why I feel skeptical. To me it seems like a solid argument in favour has just not been made.
I would feel excited about a project that tries to find out why donation rates are low (lack of money? lack of room for more funding? saving to give later and make donations better-reasoned by giving lump sums? a false perception that money won’t do much good anymore? something else?) and how we might increase them. (What’s your guess for the reasons? I’d be very interested in more discussion of this; it might be worth a separate EA Forum post if one doesn’t exist already.)
I completely agree with this (and I think I have mentioned this to you before)!
I’m afraid I only have wild guesses why donation rates are low.
More generally, I’d be excited about more qualitative research into understanding what EA community members think their bottlenecks to achieving more impact are.
Thanks for clarifying – I basically agree with all of this. I particularly agree that the “government job” idea needs a lot more careful thinking and may not turn out to be as great as one might think.
I think our main disagreement might be that I think that donating large amounts effectively requires an understanding of EA ideas and altruistic dedication that only a small number of people are ever likely to develop, so I don’t see the “impact through donations” route as an unusually strong argument for doing EA messaging in a particular direction or having a very large movement. And I consider the fact that some people can have very impactful careers a pretty strong argument for emphasizing the careers angle a bit more than the donation angle (though we should keep communicating both).
(Disclaimer: Written very quickly.)
I also edited my original comment (added a paragraph at the top) to make this clearer; I think my previous comment kind of missed the point.
I personally think the EA community could plausibly grow 1000-fold compared to its current size, i.e. to 2 million people, which would correspond to ~0.1% of the Western population. I think EA is unlikely to be able to attract >1% of the (Western and non-Western) population, primarily because understanding EA ideas (and being into them) typically requires a scientific and prosocial/altruistic mindset, advanced education, and the right age (no younger than ~16, and not so old that one is too busy with lots of other life goals). Trying to attract >1% of the population would in my view likely lead to a harmful dilution of the EA community.
While we’re empirically investigating things, it seems like finding out what proportion of the population could potentially be aligned with EA might also be a high-priority thing to investigate.