This is a discussion that has happened a few times. I do think that ‘global priorities’ has already grown as a brand enough to be seriously considered for wider use, and perhaps even as the main term for the movement.
I’d still be reluctant to ditch ‘effective altruism’ entirely. There is an important part of the original message of the movement (cf. the pond analogy) that’s about asking people to step up and give more (whether money or time) – questioning personal priorities/altruism. I think we’ve probably developed a healthier sense of how to balance that (‘altruism/life balance’), but it feels like ‘global priorities’ wouldn’t cover it.
This is an excellent point. I “joined” EA because of the pond idea. I found the idea of helping a lot of people with the limited funds I could spare really appealing, and it made me feel like I could make a real difference. I didn’t get into EA because of its focus on global prioritization research.
Of course, what I happened to join EA because of is not super important, but I wonder how others feel. Like EA as a “donate more to AMF and other effective charities” is a really different message than EA as “research and philosophize about what issues are really important/neglected.”
I’m not sure which one EA is anymore, and changing the name to ‘global priorities’ might change the movement from the Doing Good Better movement to the “Case for Strong Longtermism” movement – and those are very different. But I’m very uncertain about which one EA will/should end up as.
I want to push back against the idea that a name change would implicitly change the movement in a more longtermist direction (not sure you meant to suggest that, but I read that between the lines). I think a name change could quite plausibly also be very good for the global health and development and animal welfare causes. It could shift the focus from personal life choices to institutional change, which I think people aren’t thinking about enough.
The EA community would probably greatly increase its impact if it focused a bit less on personal donations and a bit more on spending ODA budgets more wisely, improving developing-world health policy, funding growth diagnostics research, vastly increasing government funding for clean meat research, etc.
I think I disagree with this given what the community currently looks like. (This might not be the best place to get into this argument, since it’s pretty far from the original points you were trying to make, but here we go.)
Two points of disagreement:
i) The EA Survey shows that current donation rates by EAs are extremely low. From this I conclude that there is far too little focus on personal donations within the EA community. That said, if we get some of the many EAs who are donating very little to work on the suggestions you mention, that is plausibly a net improvement, given that donation rates are so low anyway.
Relatedly, personal donations are one of the few things that everyone can do. In the post, you write that “The longer-term goal is for the EA community to attract highly skilled students, academics, professionals, policy-makers, etc.”, but as I understand the terms you use, this is probably less than 10% of the Western population. But maybe you disagree with that?
Accordingly, I do not view this as the longer-term goal of the EA community, but only as one of them. Most of the others – and most people cannot have high-flying, high-impact careers – should focus on maximizing donations instead.
ii) I think the EA community currently does not have the expertise to reliably have a positive impact on developing-world policy. It is extremely easy to do harm in this area. Accordingly, I am also sceptical of the idea of introducing a hits-based global development fund, though I would need to better understand what you are intending there.
I would be very keen for the EA community to develop expertise in this area, and some of the suggestions you make, e.g. growth diagnostics research, should help with that. But we are very far from having that expertise right now and should act accordingly.
Edit: I think my comment below kind of misses the point – my main response is simply: some people could probably do a huge amount of good by, e.g., helping increase meat alternatives R&D budgets; this seems a much bigger opportunity than increasing donations, and similarly tractable, so we should focus more on it (while continuing to also increase donations).
--
Some quick thoughts:
I personally think the EA community could plausibly grow 1000-fold compared to its current size, i.e. to 2 million people, which would correspond to ~0.1% of the Western population. I think EA is unlikely to be able to attract >1% of the (Western and non-Western) population primarily because understanding EA ideas (and being into them) typically requires a scientific and prosocial/altruistic mindset, advanced education, and the right age (no younger than ~16, not old enough to be too busy with lots of other life goals). Trying to attract >1% of the population would in my view likely lead to a harmful dilution of the EA community. We should decide whether we want to grow more than 1000-fold once we’ve grown 100-fold and have more information.
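(To make the arithmetic here concrete, a quick back-of-the-envelope check of the implied figures – the population numbers are my own rough assumptions, not from the comment above:)

```python
# Back-of-the-envelope check of the growth figures above.
# Assumptions (mine, not from the comment): "Western population" ~2 billion,
# world population ~8 billion.

target_size = 2_000_000      # community size after a hypothetical 1000x growth
growth_factor = 1_000

current_size = target_size / growth_factor    # implied current size: ~2,000
western_population = target_size / 0.001      # implied by "~0.1%": ~2 billion
one_percent_of_world = 0.01 * 8_000_000_000   # the ">1%" ceiling: ~80 million

print(f"Implied current community size: {current_size:,.0f}")
print(f"Implied Western population: {western_population:,.0f}")
print(f"1% of world population: {one_percent_of_world:,.0f}")
```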
Low donation rates indeed feel concerning. To me, the lack of discussion of “how can we best make ODA budgets more effective” and similar questions feels even more concerning, as the latter seems a much bigger missed opportunity.
I think lots of people can get government jobs where they can have a significant positive impact in a relevant area at some point in their career, or otherwise contribute to making governments more effective. I tentatively agree that personal donations seem more impactful than the career impact in many cases, but I don’t think it’s clear that we should overall aim to maximize donations. It could be worth doing some more research into this.
I would feel excited about a project that tries to find out why donation rates are low (lack of money? lack of room for more funding? saving to give later and make donations more well-reasoned by giving lump sums? a false perception that money won’t do much good anymore? something else?) and how we might increase them. (What’s your guess for the reasons? I’d be very interested in more discussion about this, it might be worth a separate EA Forum post if that doesn’t exist already.)
As you suggest, if the EA community doesn’t have the expertise to have a positive impact on developing-world policy, perhaps it should develop more of it. I don’t really know, but some of these jobs might not be very competitive or difficult to get, yet disproportionately impactful. Even if you just try to help discontinue funding programs that don’t work, prevent budget cuts for the ones that do, and generally encourage better resource prioritization, that could be very helpful.
I think EA is unlikely to be able to attract >1% of the (Western and non-Western) population primarily because understanding EA ideas (and being into them) typically requires a scientific and prosocial/altruistic mindset, advanced education, and the right age (no younger than ~16, not old enough to be too busy with lots of other life goals). Trying to attract >1% of the population would in my view likely lead to a harmful dilution of the EA community.
Thanks for stating your view on this as I would guess this will be a crux for some.
FWIW, I’m not sure if I agree with this. I certainly agree that there is a real risk of ‘dilution’, and other risks from both too-rapid growth and too large a total community size.
However, I’m most concerned about these risks if I imagine a community that’s kind of “one big blob” without much structure. But that’s not the only strategy on the table. There could also be a strategy where the total community is quite large but there is structure and diversity within the community regarding what exactly ‘being an EA’ means for people, who interacts with whom, who commands how many resources, etc.
I feel like many other professional, academic, or political communities are both quite large overall and, at least to some extent, maintain spaces that aren’t harmed by “dilution”. Perhaps most notably, consider that almost any academic discipline is huge, and yet there is formal and informal structure that to some extent separates the wheat from the chaff. There is the majority of people who drop out of academia after their PhDs and the tiny minority who become professors; the majority of papers that will never be cited or are of poor quality, and the very small number of top journals; the majority of colleges and universities where faculty are mostly busy teaching and from which we don’t expect much innovation, and the tiny fraction of research-focused top universities; etc.
I’m not saying this is clearly the way to go, or even feasible at all, for EA. But I do feel quite strongly that “we need to protect spaces for really high-quality interactions and intellectual progress” and similar claims – even if we buy them as assumptions – do not imply that it’s best to keep the total size of the community small.
Perhaps as an intuition pump, consider how the life of Ramanujan might have looked if there hadn’t existed a maths book accessible to people in his situation, a “non-elite” university and other education open to someone like him, etc.
Yeah, these are great points. I agree that with enough structure, larger-scale growth seems possible. Basically, I agree with everything you said. I’d perhaps add that in such a world, “EA” would have a quite different meaning from how we use the term now. I also don’t quite buy the point about Ramanujan – I think “spreading the ideas widely” is different from “making the community huge”.
(Small meta nitpick: I find it confusing to call a community of 2 million people “small” – really wish we were using “very large” for 2 million and “insanely huge” for 1% of the population, or similar. Like, if someone said “Jonas wants to keep EA small”, I would feel like they were misrepresenting my opinion.)
I think “spreading the ideas widely” is different from “making the community huge”
Yeah, I think that’s an important insight I also agree with.
In an ideal world the best thing to do would be to expose everyone to some kind of “screening device” (e.g. a pitch or piece of content with a call to action at the end) which draws them into the EA community if and only if they’d make a net valuable contribution. In the actual world there is no such screening device, but I suspect we could still do more to expand the reach of “exposure to the initial ideas / basic framework of EA” while relying on self-selection and existing gatekeeping mechanisms for reducing the risk of dilution etc.
My main concern with such a strategy would actually not be that it risks dilution but that it would be more valuable once we have more of a “task Y”, i.e. something a lot of people can do. (Or some other change that would allow us to better utilize more talent.)
I personally think the EA community could plausibly grow 1000-fold compared to its current size, i.e. to 2 million people, which would correspond to ~0.1% of the Western population. I think EA is unlikely to be able to attract >1% of the (Western and non-Western) population primarily because understanding EA ideas (and being into them) typically requires a scientific and prosocial/altruistic mindset, advanced education, and the right age (no younger than ~16, not old enough to be too busy with lots of other life goals). Trying to attract >1% of the population would in my view likely lead to a harmful dilution of the EA community. We should decide whether we want to grow more than 1000-fold once we’ve grown 100-fold and have more information.
I think I meant this slightly differently than you interpreted it.
My best guess is that less than 10% of the Western population are capable of entering potentially high impact career paths and we already have plenty of people in the EA community for whom this is not possible. This can be for a variety of reasons: they are not hard-working enough, not smart enough, do not have sufficient educational credentials, are chronically ill, etc.
But maybe you think that most people in the current EA community are very well qualified to enter high impact career paths and our crux is there?
While I agree that government jobs are easier to get into than other career paths lauded as high impact in the EA Community (at least this seems to be true for the UK civil service), my impression is that I am a lot more skeptical than other EAs that government careers are a credible high impact career path. I say this as someone who has a government job. I have written a bit about this here, but my thinking on the matter is currently very much a work in progress and the linked post does not include most reasons why I feel skeptical. To me it seems like a solid argument in favour has just not been made.
I would feel excited about a project that tries to find out why donation rates are low (lack of money? lack of room for more funding? saving to give later and make donations more well-reasoned by giving lump sums? a false perception that money won’t do much good anymore? something else?) and how we might increase them. (What’s your guess for the reasons? I’d be very interested in more discussion about this, it might be worth a separate EA Forum post if that doesn’t exist already.)
I completely agree with this (and I think I have mentioned this to you before)!
I’m afraid I only have wild guesses why donation rates are low.
More generally, I’d be excited about more qualitative research into understanding what EA community members think their bottlenecks to achieving more impact are.
Thanks for clarifying – I basically agree with all of this. I particularly agree that the “government job” idea needs a lot more careful thinking and may not turn out to be as great as one might think.
I think our main disagreement might be that I think that donating large amounts effectively requires an understanding of EA ideas and altruistic dedication that only a small number of people are ever likely to develop, so I don’t see the “impact through donations” route as an unusually strong argument for doing EA messaging in a particular direction or having a very large movement. And I consider the fact that some people can have very impactful careers a pretty strong argument for emphasizing the careers angle a bit more than the donation angle (though we should keep communicating both).
(Disclaimer: Written very quickly.)
I also edited my original comment (added a paragraph at the top) to make this clearer; I think my previous comment kind of missed the point.
I personally think the EA community could plausibly grow 1000-fold compared to its current size, i.e. to 2 million people, which would correspond to ~0.1% of the Western population. I think EA is unlikely to be able to attract >1% of the (Western and non-Western) population primarily because understanding EA ideas (and being into them) typically requires a scientific and prosocial/altruistic mindset, advanced education, and the right age (no younger than ~16, not old enough to be too busy with lots of other life goals). Trying to attract >1% of the population would in my view likely lead to a harmful dilution of the EA community.
While we’re empirically investigating things, what proportion of the population could potentially be aligned with EA might also be a high-priority thing to investigate.
Though I was surprised when I read the results of the first EA survey – I was expecting the majority of non-student EAs to donate 10% of their pretax income – I don’t think it’s quite fair to say that EA donations are extremely low. The mean donation of EAs in the 2019 survey was 7.5% of pretax income; the mean donation of Americans is about 3.6%. Moreover, given the significant number of EAs outside the US (where giving is lower), the fact that many EAs are students, and since I think the EA mean is computed per person rather than weighted by donation size (as the US average is), I would guess EAs donate about 3-5 times as much as non-EAs in the same demographic. I do think that we could do better, and a lot of good could come from more donations.
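(As a rough illustration of that comparison – the 7.5% and 3.6% figures are from the comment above, while the 3-5x range is the commenter’s guess, which the raw ratio alone doesn’t establish:)

```python
# Rough illustration of the donation comparison above. The survey figures are
# as quoted in the comment; the adjustments are only described qualitatively
# there, so this just shows the unadjusted starting point.

ea_mean_donation_pct = 7.5   # mean % of pretax income donated (2019 EA survey)
us_mean_donation_pct = 3.6   # mean % of pretax income donated (US average)

raw_ratio = ea_mean_donation_pct / us_mean_donation_pct
print(f"Unadjusted EA-to-US donation ratio: {raw_ratio:.1f}x")   # ~2.1x

# The comment argues the like-for-like ratio is higher (~3-5x): the EA mean is
# pulled down by students and non-US members, and it is an unweighted
# per-person mean, while the US figure is weighted by donation size.
```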
I think I more or less agree with you. However, my point wasn’t about longtermism, but rather just about the difference between the project that DGB was engaged in and MacAskill’s later work on cause prioritization. Like, one was saying, “Hey! Evidence can be really helpful in doing good, and we should care about how effective the charities we donate to are,” while the other was a really cerebral, unintuitive piece about what we should care about, and contribute to, for expected-value reasons. These are two very different projects, and it’s not obvious to me which one EA is at the moment. To use a cliché, EA maybe has an identity crisis: the classic EA pitch of Peter Singer and DGB and AMF is a very distinct pitch from the global-prioritization one. And whichever one EA decides on, it should acknowledge that these are different, regardless of which is more or less impactful.
I’m all for focusing on the power of policy, but I’m not sure giving up any of our positions on personal donations will help get us there.