The Value of Those in Effective Altruism
Summary/TL;DR: this piece offers Fermi Estimates of the value of those in EA, focusing on the distinctions between typical EA members and dedicated members (defined below). These estimates suggest that, compared to the current movement baseline, we should prioritize increasing the number of “typical” EA members and getting more non-EA people to behave like typical EA members, rather than getting typical EAs to become dedicated ones.
[Acknowledgments: Thanks to Tom Ash, Jon Behar, Ryan Carey, Denis Drescher, Michael Dickens, Stefan Schubert, Claire Zabel, Owen Cotton-Barratt, Ozzie Gooen, Linchuan Zheng, Chris Watkins, Julia Wise, Kyle Bogosian, Max Chapnick, Kaj Sotala, Taryn East, Kathy Forth, Scott Weathers, Hunter Glenn, Alfredo Parra, William Kiely, Jay Quigley, and others who prefer to remain anonymous for looking at various draft versions of this post. Thanks to their feedback, the post underwent heavy revisions. Any remaining oversights, as well as all opinions expressed, are my responsibility.]
Introduction
There has been some discussion recently of whether the EA movement is excessively or insufficiently oriented toward getting typical EA members to become dedicated ones. The crux of this discussion, from a mechanistic movement building perspective, is whether, compared to the current baseline:
1) The EA movement should put more effort into attracting as many value-aligned people to the EA movement as possible and keeping them in the movement, as well as getting non-EA members to behave more like typical EAs, or
2) The EA movement should put more effort into getting all of those in the EA movement to become as engaged as possible, at the expense of some people becoming disengaged due to this pressure and others not wanting to join because of the perception of high demands.
Which one will contribute most to global flourishing?
Semantics
A sub-branch of this discussion has focused on the terminology to use in describing those more or less engaged in the EA movement. My position is that the level of engagement is not binary, but lies on a spectrum, as noted here. For the sake of clarity, and without claiming these are the optimal terms to use broadly, this piece will use “typical” for those closer to the casual end of the engagement spectrum, and “dedicated” for those closer to the highly engaged pole.
Fermi Estimates
While there’s no reliable hard data available, we can do a Fermi estimate to start putting some approximate numbers on the actual resources, of time and money, that each of these EA cohorts contributes. I will use numbers expressed publicly by others to minimize skewing the numbers with my own perspectives.
One highly-upvoted comment expressed “I expect 80%+ of EAs and rising to be ‘softcore’ for the foreseeable future.” This correlates with the 90/9/1% generalization, and makes prima facie sense based on the distribution of EA group organizers versus participants, etc. So let’s say that anywhere from 10 to 20 out of 100 EAs are closer to the dedicated pole, and this will continue to be the case as the movement grows.
Another highly-upvoted comment suggests that “a fully committed altruist usually accomplishes about as much as three to six people who do little beyond pledging.” This might be a more controversial claim than the previous one.
First, the comment itself was strongly advocating for focusing more on getting more dedicated EA members rather than typical ones, and thus might have exaggerated the impact of dedicated EA participants. In fact, the comment was in response to a post I made and went against the sentiments I expressed in that post, which is one reason I chose these numbers: to avoid going with numbers that matched my intuitions.
Nonetheless, I think the 3-6 times impact of dedicated EA members may not be an exaggeration, for several reasons:
1) Those closer to the dedicated pole would likely provide more resources, of time and money, to advance global flourishing than those closer to the typical end of the spectrum.
2) Those more dedicated would be more exposed to and informed about the complex issues related to EA, such as the challenges of cause prioritization and evaluating QALYs for systemic interventions and existential risk, and can contribute more to creating the memetic and organizational infrastructure for the most effective EA movement.
3) On a related note, dedicated EA participants would also likely search harder for the most impactful and cost-effective places to give, rather than just going with GiveWell’s top picks, multiplying the impact of their giving by supporting “weird” charities and meta-charities, etc.
4) Those more dedicated would channel their resources of time and skills more effectively into advancing global flourishing than typical EAs, ranging from volunteer work to choosing and switching careers based on a desire to optimize global flourishing and fill talent gaps.
The attention of some readers might be drawn to EA notables such as Peter Singer, William MacAskill, Tom Ash, Jon Behar, Ryan Carey, Brian Tomasik, Kerry Vaughan, Tyler Alterman, Julia Wise, Owen Cotton-Barratt, Ozzie Gooen, and others, including frequent EA Forum participants, when evaluating dedicated EA members. Indeed, they do much more good than 3-6X the good done by typical EA participants, through a combination of convincing many more people to do EA-aligned activities and building the infrastructure of the EA movement. Yet we should remember that such notables are atypical, do not represent the vast majority of dedicated EA members, and their contributions fold into the overall 3-6X contributions of dedicated people. Separately, one reader of the draft version suggested we should come up with an additional term for such EA notables, such as “rock stars,” who do more than 100X as much good as a typical EA participant, and I will leave that for readers to discuss in the comments.
I hope this shows why I think the 3-6 times impact of dedicated EA members is plausible, and I plan to use it in this analysis. Using the handy Guesstimate app created by Ozzie Gooen, here is a link to a model that shows the results of this comparison. You can also see a screenshot of the model below.
Now, these are Fermi estimates, and I invite you to use this model to put in your own estimates and see what you come up with. After all, the upvoting of the comments above does not indicate that these numbers are correct in any objective sense. For instance, you might believe that the numbers for the impact of dedicated EA members are in fact exaggerated, despite the defence I provided above, or understated. It would be helpful if whatever numbers you choose come from a source outside of yourself to minimize letting personal intuitions weigh the numbers in favor of your preferred position, but use your own judgment, of course.
We should also compare the resource contributions of typical and dedicated EA participants to an ordinary member of the general public who is not value-aligned with the EA movement, to get a better grasp on the value EA members provide to improving the world. Let’s say a typical member of the general public contributes 3.5% of her/his resources to charitable causes, both time and money. By comparison, let’s say a typical EA member contributes around 10% of her/his resources, in various combinations of time and money, to charity. Being generous, we can estimate that the resources provided by non-EAs are ~100 times less impactful than those of EA participants, due to the higher effectiveness of EA-endorsed charities.
Let’s compare the impact of such giving. Here is a link to a model that does so, and here is a screenshot of the model.
As you can see, the impact of typical EA participants is ~28500% more than an ordinary member of the public, and the impact of dedicated ones is ~450% more than typical ones (the likelihood of dedicated EA participants selecting better charities is included in the 3-6X greater impact). You’re welcome to plug in your own numbers to get your own Fermi estimates, of course.
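For readers who prefer to check the arithmetic outside Guesstimate, here is a minimal point-estimate sketch of the model in Python. All inputs are this post’s illustrative assumptions (3.5% vs. 10% giving shares, a ~100X effectiveness multiplier, and the midpoint of the 3-6X claim), not measured data:

```python
# Point-estimate sketch of the Guesstimate model, per 100 units of resources.
# Every input below is an illustrative assumption from the post, not data.

public_giving_share = 0.035   # typical non-EA gives ~3.5% of resources
typical_giving_share = 0.10   # typical EA member gives ~10%
ea_effectiveness = 100        # assumed multiplier from EA-endorsed charities
dedicated_multiplier = 4.5    # midpoint of the 3-6X claim

public_utilons = public_giving_share * 100                       # ~3.5
typical_utilons = typical_giving_share * 100 * ea_effectiveness  # ~1000
dedicated_utilons = typical_utilons * dedicated_multiplier       # ~4500

print(typical_utilons / public_utilons)     # ~285X, i.e. ~28500%
print(dedicated_utilons / typical_utilons)  # 4.5X
```

The Guesstimate version uses ranges rather than point estimates, so its outputs are distributions; this sketch only reproduces the midpoints.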
Implications
1) Getting a non-EA to behave like a typical EA member yields an increase in global flourishing per individual from ~3.5 to ~1000 utilons. This is a huge increase, one worth framing in percentage terms to address scope insensitivity: ~28500%. Conversely, losing a typical EA member, or not gaining one in the first place, drops that person’s contribution to global flourishing from ~1000 utilons back to ~3.5.
2) Moving a typical EA to a dedicated EA results in a ~450% increase in global flourishing, from ~1000 to ~4500, and a dedicated EA becoming a typical EA results in a ~450% decrease in global flourishing, from ~4500 to ~1000.
3) Since the number of dedicated EA members is capped by the number of typical ones, in order to get more dedicated ones, we have to get substantially more typical ones: roughly 4-9 typical EA members for each dedicated one, given the 10-20% estimate above.
4) Typical EA members as a cohort provide substantially more total value, ~25% more, in resources of time and money contributing to EA efforts to advance global flourishing than dedicated EA members.
4A) This Fermi estimate suggests that the EA movement does not function according to the Pareto principle, with 80% of the output produced by 20% of the input. For the sake of epistemic honesty and for the purpose of movement-building, it would be beneficial to publicly acknowledge and recognize the contributions of typical EA movement members.
4B) Additionally, for the sake of epistemic honesty, it’s important to acknowledge that other ways of calculating the impact of typical and dedicated EA participants might result in different estimates, depending on what you optimize for. For example, if you personally optimize for reducing existential risk or animal suffering, it might be that dedicated EA members spend more of their resources on those areas of EA activism as opposed to the more mainstream focus of addressing global poverty.
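The cohort-level comparison behind point 4 can be sketched in a few lines, using the midpoints of the earlier assumptions (an 85/15 split per 100 members and a 4.5X dedicated multiplier):

```python
# Cohort totals per 100 EA members, using midpoints of the post's assumptions.
typical_count, dedicated_count = 85, 15    # midpoint of the 80-90 / 10-20 split
typical_utilons = 1000                     # per typical member (from the model above)
dedicated_utilons = 4.5 * typical_utilons  # midpoint of the 3-6X claim

typical_total = typical_count * typical_utilons        # 85,000
dedicated_total = dedicated_count * dedicated_utilons  # 67,500
print(typical_total / dedicated_total - 1)  # ~0.26, i.e. roughly 25% more
```

Note that the conclusion is sensitive to where in the 10-20% and 3-6X ranges the true values fall: at 20% dedicated and 6X impact, the dedicated cohort would dominate instead.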
Discussion
What other numbers would be useful?
While the numbers on the benefits of focusing on getting non-EAs to behave more like typical EA members are prima facie convincing, we need to acknowledge that there are salient numbers we don’t have, where the available information is currently too vague even for a Fermi estimate.
For example, it would be great to know how many resources it takes to get a non-EA to behave like an EA member. This is an area I have some knowledge about. Let’s take exposure to an EA-themed article as an example. I published this article on The Huffington Post, which was shared widely on social media. As you’ll see from this Facebook comment on my personal page, it helped convince someone to decide to donate to effective charities. Furthermore, this comment is from someone who is the leader of a large secular group in Houston, and he thus has an impact on a number of other people. Since people rarely make actual comments, and far from all are fans of my Facebook page, we can estimate that many more made similar decisions but did not comment about it.
Another piece of evidence is that people who clicked over to the GiveDirectly website from this article I wrote donated $500, according to internal GiveDirectly stats. It is highly likely that people who clicked through and immediately donated as a result of reading the article had just found out about GiveDirectly, and made a test donation, as is typical for initial nonprofit donations; the lifetime value of their donations to GiveDirectly will be much greater. Likewise, the vast majority of people who will have found out about GiveDirectly from the article will take a while to donate as they research the topic and consider their donations. After all, an immediate donation from an article is pretty rare; usually people proceed much more slowly through the 4 steps of the nonprofit sales funnel and take time to consider and evaluate the nonprofit before donating.
Now, each of these articles took about 15 hours of my own labor to write and edit, about 10 hours from other EA participants collaborating with me on editing, and about 5 hours per article to place and promote.
Yet these numbers represent my labor in particular, and I specialize in promoting EA-themed effective giving to broad audiences. We need more numbers and data to get a better estimate of the average impact of such articles, and how much utility there is from those other than me working on this area. We also need numbers on the multitude of other areas of activity that can get ordinary people to behave like EA members, such as Giving Games and other activities. I would invite readers who have more familiarity with these areas of activity to provide their thoughts in comments, and also with your own models in the Guesstimate app.
Another set of numbers that we don’t have, and that it would be great to have, concerns how many resources it takes to get a typical EA participant to become a dedicated one. In doing so, we also need to estimate how much risk there is of causing a typical EA member to leave the movement due to the perception of high demands, and factor in that downside.
This is not an area I specialize in, and I hope some readers who do will leave their thoughts in comments, and also consider creating Guesstimate models of their own.
What are specific steps we can take to advance getting non-EAs to behave more like typical EAs?
1) One way of doing so is promoting EA to those who are value aligned. We should be wary of promoting the EA movement to those who are not value aligned, due to the downside of flooding the EA movement with non value-aligned people.
2) Another way is getting non-EAs to behave more like typical EAs without getting non-value-aligned people into the EA movement. Promoting EA-themed effective giving ideas is one way of getting ordinary people to behave more like EAs without the downside of flooding the EA movement with non-value-aligned members. For instance, if the ~3.5% of resources that non-EAs give to improving the world can be redirected to EA causes, this would result in a ~100X increase in impact: ~350 utilons instead of ~3.5, a ~10000% increase in EA-aligned efforts to advance global flourishing, regardless of whether or not the non-EA becomes value-aligned and joins the EA movement. Moreover, it is likely that if people are persuaded to act more like EA members, they will shift their values to grow more value-aligned, and would eventually be ready to join the EA movement.
2A) Promoting effective giving includes publishing articles for a broad audience. Such articles have the promise of hitting a lot of people at once, but their impact is varied.
2B) Promoting effective giving to social elites who have lots of money to give. This can include a variety of strategies from promoting effective giving to niche well-off audiences, to selectively promoting pre-existing elite-giving strategies that are aligned with EA strategies. An example of the latter might be to write articles promoting well-known charities that engage in EA-like giving—the Gates Foundation, Good Ventures—while also promoting EA concepts such as data-driven giving, solving the drowning child problem, etc.
2B1) Getting elites to change their giving strategies takes more effort than getting non-elites to do so, since there are many competing for their attention and wealth, but we currently don’t have sufficient evidence to compare the trade-offs of focusing on either group. I suspect in any case we should pursue promoting EA-themed effective giving both to elites and non-elites.
2C) Providing materials and resources to local EA groups to enable them to promote effective giving on a local level, as well as teaching them how to do so. One project relevant to the former might be to create a marketing resource bank for all EAs to use, with materials to promote effective giving to broad audiences, and the EA movement to value-aligned people. One point relevant to the latter might be to hold workathons training EA participants on marketing and promotion.
2D) Promoting effective giving to gatekeepers and influencers who would then promote effective giving to their own audience due to mutually compatible incentives. For example, a number of organizations with affiliates who don’t have an inherent interest in EA might want to run Giving Games for other reasons, such as because they believe their affiliate members would benefit from them.
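The redirected-giving arithmetic in point 2 above is simple enough to check directly. This sketch uses the post’s assumed ~100X effectiveness multiplier, which is the key (and most debatable) input:

```python
# Redirecting an average giver's existing ~3.5% of resources toward
# EA-endorsed charities, without changing how much they give.
baseline_utilons = 3.5
effectiveness_gain = 100  # assumed multiplier for EA-endorsed charities

redirected_utilons = baseline_utilons * effectiveness_gain
print(redirected_utilons)  # 350.0 utilons, versus 3.5 at baseline
```

If the effectiveness multiplier is lower than 100X, the gain shrinks proportionally, but the qualitative conclusion (redirecting existing giving is high-leverage) holds for any multiplier well above 1.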
Are the negative consequences of high expectations and pressure that bad?
Now, some people might doubt that we lose EA members due to the pressure of high expectations and burnout. Yet there are many people who leave the EA movement because of the perception that they are only really welcome if they do as much as they can to contribute to EA causes.
Doing so can be exhausting and lead to burnout, as it did for me. While I did not choose to leave, many do leave the movement because of burnout. I spoke to many about this after I started sharing my story publicly a while ago.
Others leave because their circumstances change. I spoke to people who were donating 50% of their income, and then their circumstances changed – job loss, moving, other transitions – and they could not afford to do so anymore. Rather than suffer what they perceived as the stigma of not being good enough anymore, they disengaged from the movement. Others were contributing a huge amount of time to the movement during their college years, but then they graduated, moved, and lost the community support that kept their activism going and gave them a strong sense of purpose. Because of survivorship bias, most of those in the movement don’t see them as they participate less and less, and they fade quietly into the background, resulting in huge losses of money and time/skills for the movement.
Others choose not to join in the first place because of the high expectations, even those who are otherwise value-aligned. For instance, Taryn describes how she is value-aligned with EA and does EA-themed activities. Yet she is reluctant to identify with the EA movement due to the “general unspoken feeling of ‘you’re not doing enough unless you meet our high expectations,’” as expressed in her Facebook comments here. Or take the comments of Kaj here (who permitted me to cite him): “Datapoint—I too have felt unsure whether I’m doing enough to justifiably call myself EA. (I have both worked for and donated to MIRI, ran a birthday fundraiser for EA causes, organized an introductory EA event where I was the main speaker, and organized a few EA meetups. But my regular donations are pretty tiny and I’m not sure of how much impact the-stuff-that-I’ve-done-so-far will have in the end, so I still have occasional emotional doubts about claiming the label.)”
How many of you think Kaj should not identify with the EA movement? Yet the only role models in the EA movement right now are those who are highly involved and committed. No real steps are taken to acknowledge and celebrate typical EA members for the benefits they bring to the movement. Some easy steps to celebrate typical EA participants, and to ease off the pressure to do whatever they can, would likely result in a significant overall gain for the EA movement.
What are steps we can take to address these problems?
1) We can encourage the publication of articles that give typical EA members the recognition they deserve for doing so much for the movement (more, in aggregate, than dedicated ones, according to the calculations above).
2) We can invite typical EA members to speak about their experiences at the 2016 EA Global.
3) We can publish interviews with typical EAs.
4) We can feature a few typical EA members on the redesigned version of effectivealtruism.org.
5) We can pay more attention to burnout and self-care than we currently do, and highlight the importance, for those trying to change the world, of orienting toward the long term, thinking of their civic engagement as a marathon and not a sprint.
6) We can also celebrate the people who take small steps up the spectrum of engagement: those who go from 1% to 5%, or from 5% to 10%, of their income donated; those who take the TLYCS or GWWC pledge; those who increase their level of volunteering from being EA group members to EA group organizers, who start to give EA talks or host Giving Games, who start to write blog posts about their EA engagement, or who start to volunteer for EA meta-charities or effective direct-action charities. The key is to encourage and praise multiple small steps along multiple paths: for instance, people who don’t want to give talks may be happy writing blog posts, and vice versa, and nobody should be discouraged by the impression that only one particular activity earns support. It’s also good to have ways of acknowledging people who come up with new ways of advancing global flourishing. The underlying logic here is to pay attention to people’s emotions, social signaling, and group dynamics, and then provide appropriate rewards and positive reinforcement for higher engagement, without making people feel excluded if they choose not to engage more, or if they need to drop out and re-engage later.
7) On a related note, we can publicly promote the idea that people will shift their commitment levels of time and money to the movement as their circumstances change, and that’s ok! Heidi Overbeek is one example of a person who did so. People like Heidi should be welcomed to commit as much as they are willing and able, and not asked to keep committing at their previous level despite their change in circumstances. It’s especially easy for us dedicated EA participants to make the mistake of demanding that people stick to prior commitments due to the human brain’s vulnerability to loss aversion, but knowing about this tendency, we can avoid it.
Now, I’m not suggesting we should make most speakers typical EAs, or write most articles about or conduct most interviews with them. Overall, my take is that it’s appropriate to celebrate individual EA members in proportion to their labors, and as the numbers above show, dedicated EA participants individually contribute quite a bit more than typical ones. Yet we as a movement need to go against the current norm of not acknowledging typical EA members: we should give them the recognition they deserve for contributing, in aggregate, more than dedicated EA participants, and avoid making excessive demands on them that will cause them to leave the movement. These are just some specific steps that would help us achieve this goal.
Conclusion
For the sake of global flourishing, in comparison to the current baseline of the EA movement, it’s more valuable to:
1) Focus on attracting value-aligned people and retaining typical EA members, and decreasing the emphasis on transforming typical EA members into dedicated ones. This would both increase global flourishing by ensuring high numbers of typical EA members, and also provide the baseline population needed to get more dedicated EA participants.
2) Focus on getting members of the general public to behave more like typical EA members by promoting effective giving broadly, thus changing people’s giving behaviors and channeling their existing resources into EA-aligned causes. Over time, for some people this would result in changing values to make them EA-aligned and ready to join the movement.
The key numbers to estimate here are:
The relative (incremental) good done by someone who dedicates their whole career to doing good vs the (incremental) good done by someone who takes a moderate interest in effective altruism (A/B).
The relative difficulty of recruiting an extra person into the former camp vs the latter (X/Y).
At that point the relative value of pursuing each approach is simple to calculate (A/B * Y/X). Of course we will always want a mixture of the two, because of i) declining returns in each approach ii) complementarities between the two.
I would spend more time running over the considerations regarding what these numbers should be. For what it’s worth, I think the 3-6x is too low, and a figure in the range of 10-100x would be reasonable. If a full-time EA simply doubles their income, gives twice as large a share of their income, and finds a charity that is twice as good, relative to a part-time EA, that is already an 8x improvement. And that is restricted only to earning-to-give options.
I don’t think we need to estimate the baseline good done by someone who is not involved in EA to figure out the relative increments. Nonetheless, I think your estimates here are too high:
“As you can see, the impact of typical EA participants is ~28500% more than an ordinary member of the public, and the impact of dedicated ones is ~450% more than typical ones...”
This would imply that a dedicated EA is doing over 1000x more good for the world than an average member of the general public. I don’t think that is likely. To start with, some people are doing very valuable strategic work without any involvement in EA; others end up doing a lot of good just by accident, guided by common sense, market prices, or whatever else. Furthermore, because the effect of anything on the long-term future is so unclear, many EA projects may end up doing less good than it superficially appears today. That makes such a dramatic multiple implausible on its face.
The main justification I could see for such a huge multiple would be if you thought on average people were neither making the world better nor worse. In that case using a multiple for comparison is unhelpful, as all you are doing is dividing by a number ~zero.
Glad to talk about the numbers! I specifically used the 3-6X as it came from outside of myself, to minimize the bias of my own personal take on the matter.
Regarding the 10-100X, I personally think that is a bit high, especially considering your latter point of “many EA projects may end up doing less good than it superficially appears.” I agree strongly with that statement. For instance, according to GiveWell’s write-up, life-saving interventions such as malaria nets in areas where access to contraception is weak are likely to lead to some acceleration of population growth, which in turn has negative consequences both for human and animal well-being that we cannot easily estimate. This is only one of many examples.
For this reason, I believe the 3-6X figure of current impact in comparing typical to dedicated EA members is closer to the truth. My take is that it’s more important for more people to be involved in the EA movement, since I have a strong belief that over time, the EA movement will figure out more optimal ways of estimating the impact of various interventions, and then we as a movement can update and shift our giving. Similarly, it’s important for more non-EAs to behave like EAs and give effectively—not as part of the movement, but influenced by the memes of the movement to update their giving choices based on shifting evidence.
I think on average people are making the world slightly better, guided by various incentive structures. But on average people are not committed to making the world as good as it can get through their actions. I think this intentionality on the part of EA participants, our willingness to devote sizable resources to this area, and our willingness to update based on evidence justifies the huge multiple, regardless of the fact that some EA projects may end up doing less good than it superficially appears. I have a strong confidence in the movement getting better and better over time in choosing the best areas to prioritize and growing in structural and memetic coherence.
“I specifically used the 3-6X as it came from outside of myself, to minimize the bias of my own personal take on the matter.”
I don’t understand—how is another random person’s judgment less biased than your own? Maybe just average the two, or conduct a survey and use the average/median?
I think you are right about the difficulty of estimating flow-through effects. But that counts in favour of full-timers and against part-timers. Full-timers are more likely to improve these estimates, and respond to arguments about them.
My enthusiasm about full-timers is partly driven by enthusiasm for existential risk over causes part-timers focus on (GiveWell recommended charities).
Regarding the good done by people interested in EA vs the general public: let’s, as an upper bound, say someone in this class is earning a typical graduate wage and gives 10% to the Against Malaria Foundation over a 40-year career. In that case their EA-related activities might save ($7,000 * 40 / $3,000) 93 lives in developing countries. For that 28500% figure to hold, we would have to think the average non-EA does the equivalent good, over the course of their entire private and professional lives, of saving only 1/3rd of a life in the developing world. This is too pessimistic in my view. Members of the general public include great researchers, business-people and activists, who sometimes make huge breakthroughs that benefit us all (whether intended or not). It also includes people in professions that consistently make modest contributions to others’ lives and maintain the infrastructure that allows civilization to prosper, like teachers, farmers, doctors, social workers, police, and so on. It also includes many who simply make good choices that benefit themselves and their families.
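The back-of-envelope above can be written out explicitly. The wage-derived donation, career length, and cost-per-life figures are the commenter’s stated assumptions, not established facts:

```python
# Upper-bound sketch of the lives-saved arithmetic in the comment above.
annual_gift = 7000     # 10% of an assumed typical graduate wage
career_years = 40
cost_per_life = 3000   # assumed cost to save a life via AMF

lives_saved = annual_gift * career_years / cost_per_life
print(lives_saved)  # ~93.3 lives over a career

# For the ~285X (28500%) multiple to hold, the average non-EA's lifetime
# good would have to equal only this many lives:
print(lives_saved / 285)  # ~0.33, i.e. about a third of a life
```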
On top of that I think saving a life (and probably improving someone’s education, etc) in a rich country is more valuable than saving one in the world’s poorest countries for two reasons: firstly, welfare in rich countries is significantly higher so the gain to the person themselves is greater; secondly, people in the rich world have more opportunities to contribute to global public goods through innovation, transfers to the disadvantaged, or simply growing the world’s stock of physical/human capital.
I would suggest an upper bound of a 100x impact gain from becoming a GWWC member.
I think I noted above in the article that the 3-6X estimate came from someone explicitly disagreeing with my sentiments expressed in a prior post, so I went with that estimate as a means of de-anchoring. I like the idea of conducting a survey, and I figured having readers put in their own numbers in comments would help address this issue. Seems to have worked well, as you gave your own estimate, and so did others in the comments. But a survey would also be quite good, something to think about—thanks!
I hear you about the likelihood that dedicated people will likely update quicker based on new information, I included that in the part of the article about the 3-6X difference, namely points 2 and 3.
I like GiveWell’s work on the Open Philanthropy Project, which I think enables it to help channel the donations of more typical EA participants to causes like existential risk.
My response would be that I find your perspective a bit optimistic :-) We have to keep in mind that many non-EA participants harm rather than improve the world, for example people who produce cigarettes. While that one is obvious, I would argue that more broadly, the general consumer market has many downsides, and some aspects of it harm rather than improve the world. Likewise, some great researchers and philanthropists may be contributing to existential risk rather than reducing it, and thus harming rather than improving the world.
This leads me to challenge the idea of an upper bound of a 100x impact gain from becoming a GWWC member, as participating in the EA movement would likely correlate with people moving away from activities that actively harm the world.
I am sympathetic to the idea that on average what humans are doing is neither good nor bad, but from that perspective we are back at the divide by zero problem. As we don’t have a good sense of how good the work of a typical person is—not even whether it’s positive, neutral or negative—we should avoid using it as a unit of measure.
I generally see people as making the world slightly better. It seems that the world has become slightly better over time, in comparison to the past. So before the EA movement was around, the world was improving. This suggests a non-zero value for the “typical person.”
However, the EA movement, per individual within it, has improved the world a great deal more than the typical person, and has the potential to improve it even further as it gets more value-aligned people on board, or more non-value-aligned people behaving like EA participants. So this is the value I am trying to get at with the large difference between “typical person” and “EA participant.” We can have a conversation about the numbers, of course :-)
The 3-6x number was a vague guess. I would revise it to 10x, 20x or more for averages.
Fair enough :-) What are your motivations for giving a different number? What do you think is closer to the truth and why do you think so?
I was only thinking of the amount of money moved when I said that number. By EAs being more informed about principles of cause selection and mechanisms to do good in the world, their money will go much further. Second-order long term epistemic effects of the movement being populated with more involved and serious people are difficult to quantify but probably significant.
Cool, thanks for sharing that!
I like your analysis, but the numbers here feel unexplained at best and made up at worst. It would have been better to look at real numbers rather than speculating.
The most concrete numbers I think we can consider are the donation figures from the 2013 EA Survey. We tracked $5.2M in giving, with $2M (38% of donations) coming from a single donor (0.1% of the sample), and the top 10% of donors in our sample accounting for 90% of the donations (being in the top 10% of our sample required donating $8.7K/yr).
This would suggest that landing one big fish (e.g., Dustin Moskovitz and Good Ventures, not in our sample, but worth ~$5B) is on par with the entire rest of the EA movement from the limited donations-only perspective.
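As a quick sanity check on these concentration figures, here is a back-of-the-envelope sketch (the dollar amounts are the survey figures quoted above; nothing else is assumed):

```python
# Back-of-the-envelope check of the 2013 EA Survey figures quoted above.
total_donations = 5_200_000  # ~$5.2M tracked in the survey
top_donor = 2_000_000        # largest single donor in the sample

top_donor_share = top_donor / total_donations
print(f"Top donor's share: {top_donor_share:.0%}")  # ~38% of all donations

# The top 10% of donors reportedly accounted for ~90% of donations,
# leaving the remaining 90% of donors with only ~10% of the total.
top_decile_share = 0.90
print(f"Bottom 90% of donors: {1 - top_decile_share:.0%} of the total")
```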
However, this does not mean that it is right to focus more on landing “big fish” as you’re right that:
(a) moving a typical $1k/yr donor from ineffective charity to AMF is likely higher ROI (if you believe the 1000x effectiveness multiplier) than moving a $1k/yr AMF donor up to $10k/yr (10x multiplier).
(b) the incentive effects and PR effects on the overall movement may still be negative enough to outweigh the gains.
(c) work on drawing in a large movement also helps draw in the big fish because they’re attracted to the movement size.
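Point (a) can be made concrete with a toy multiplier calculation. The 1000x figure is the assumption stated in the comment, not an established number, and these are per-donor ratios rather than absolute totals (scaling an already-effective donor up 10x can still move more absolute money):

```python
# Toy per-donor impact multipliers for the two interventions in point (a).
AMF_MULTIPLIER = 1000  # assumed effectiveness of AMF vs. an ineffective charity

# Intervention 1: redirect a $1k/yr donor from an ineffective charity to AMF.
# The donor's dollars stay the same, but each dollar does ~1000x the good.
redirect_gain = AMF_MULTIPLIER / 1

# Intervention 2: grow a $1k/yr AMF donor into a $10k/yr AMF donor.
# Same effectiveness per dollar, 10x the dollars.
scale_up_gain = 10_000 / 1_000

print(redirect_gain, scale_up_gain)  # 1000.0 10.0
```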
I’d be curious if the model could be refined or adjusted. You’re right that contributions follow a Pareto distribution, but it’s not obvious to me what this would mean in the model. It could mean that it’s worth recruiting lots of new people because one may be a ‘big fish’, or it could mean that it’s more worthwhile to take people who are already doing very well and make sure they do better. It depends on other factors.
I guess this would help us to at least look at the model through the lens of the most high-impact (or people with ‘potential to be high impact’, whatever that may mean) people.
That said, the really big question is what the purpose of the model is. What is the decision being made? If it is being used by ‘average ea-forum visitors’ to do local work, it’s quite different from a model aimed at a particular EA org. For instance, many people may not really be able to persuade or help those in the top 3%, but they may be able to make great gains with the others.
I hear you about the numbers. As I stated above, I specifically used numbers that came from highly-upvoted comments not made by myself, to minimize the bias of my own personal take on the matter.
I actually considered using the numbers on the surveys, but I found they didn’t capture volunteer efforts. A large proportion of people who identified as EA members reported not giving any money at all. Yet since they considered themselves EA movement participants, I felt it important to avoid leaving them out. Moreover, I personally know a number of EA members who give their time and skills as opposed to money. This aligns with the recent great piece by Ben Todd about talent being more important than money.
Moreover, in alignment with Robert Wiblin’s comments, I think that a number of EA projects may end up doing less good than it may appear right now. So I personally put a higher value on simply being involved with the EA movement, since I have a strong belief that over time, the EA movement will figure out more optimal ways of estimating the impact of various interventions, and then we as a movement can update and shift our giving. Similarly, it’s important for more non-EAs to behave like EAs and give effectively—not as part of the movement, but influenced by the memes of the movement to update their giving choices based on shifting evidence.
Agreed on points a, b, and c.
I expect time would be equally disproportionate, with the people working at EA orgs generating a huge % of the value of “EA time”. I would expect if you surveyed EA orgs there would be broad agreement about staff vs volunteer value.
Just to clarify, I’m talking about those who donate their resources of time and talents, not those who get paid for it.
I’d say the assumptions in this paragraph require much further examination. They are very far from a given, and not something that I believe many of these people would definitively claim. There’s greater uncertainty in much of what they’re doing, so much so that their impact at least has the potential to be much less than that of those you don’t consider ‘rock stars’.
“The attention of some readers might be drawn to EA notables such as Peter Singer, William MacAskill, Tom Ash, Jon Behar, Ryan Carey, Brian Tomasik, Kerry Vaughn, Tyler Alterman, Julia Wise, Owen Cotton-Barratt, Ozzie Gooen, and others, including frequent EA Forum participants, when evaluating dedicated EA members. Indeed, they do more good, much more good, than 3-6X the good done by typical EA participants, through a combination of convincing many more people to do EA-aligned activities and building the infrastructure of the EA movement. Yet we should remember that such notables are atypical, and do not represent the vast majority of dedicated EA members, and their contributions fold into the overall 3-6X contributions of dedicated people. Separately, one reader of the draft version suggested we should come up with an additional term for such EA notables, such as “rock stars,” who do more than 100X as much good as a typical EA participant, and I will leave that for readers to discuss in the comments.”
That’s a fair point, and a good criticism!
I agree that there’s greater uncertainty about the long-term positive impact of what EA “rock stars” are doing (note—my use of the term does not indicate we should adopt it, I’m just using it as a placeholder). Yet the same can be said about the long-term impact of the EA movement as a whole, of course.
My perspective is that people who are trying to do much more good than the typical EA (100X) think much harder about their long-term impact and try to pick the best course of action. They might disagree with each other, but they will have spent a lot of time thinking through their options and choosing the ones they seriously consider to be the best going forward. My educated guess is that doing so is most likely to correlate with 100X positive outcomes.
Of course, this is not certain, and I acknowledge I should have been a bit more cautious in that statement to not convey certainty. Appreciate you calling me out on it!
I find it difficult to square this argument with the observed data. The distribution of impact in EA is very skewed such that a small number of people have almost all the impact. For example, Cari Tuna’s impact is worth more than every single GWWC pledger combined. Similar things can probably be said about Holden and Eli, Eliezer, Nick Bostrom and others. For these individuals small increases in the degree of dedication that occurred early on likely resulted in huge increases in total impact.
It seems like we should expect the same phenomenon to hold true in the future. If so, increasing the dedication of those likely to be at the peak of total impact is likely to generate much more marginal impact than adding new members. Interested in your thoughts.
EAs seem to have more “impact inequality” than the population in general. For example, among charitable donations in general, about 20% (IIRC) come from the extremely wealthy, but among EAs it’s more like 70%. Or to put it another way, about 1 in 500,000 Americans are involved in EA, but 1 in 200 of the Forbes 400 are involved in EA. So you might expect EA to move closer to the population distribution as it grows.
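To put a number on that overrepresentation (using the rough “1 in 500,000” and “1 in 200” figures from the comment above, which are approximations, not precise statistics):

```python
# How overrepresented is EA among the Forbes 400, given the rough rates above?
ea_rate_general = 1 / 500_000  # ~1 in 500,000 Americans involved in EA
ea_rate_forbes = 1 / 200       # ~1 in 200 of the Forbes 400 involved in EA

overrepresentation = ea_rate_forbes / ea_rate_general
print(f"Forbes 400 overrepresentation: ~{overrepresentation:,.0f}x")  # ~2,500x
```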
I think this is a really important phenomenon to unpack.
On a meta-level, regarding Cari Tuna, Holden and Eli, yourself, Eliezer, Nick Bostrom, Peter Singer, etc., this is one reason I think it may be beneficial to distinguish the big notables of the EA movement from the category of “dedicated,” although I don’t think the term “rock star” I used above is optimal. Such high-achieving EA members stand out for the degree of their contribution to and impact on the movement, whether through their financial giving such as Cari Tuna and Dustin Moskovitz, through their intellectual contributions such as Peter Singer, or through their EA movement infrastructure work as the rest of the people named.
Yet we cannot predict in advance who will bring such benefits. It may be that people who are now becoming typical EA members are the ones who will be the next Cari, Eliezer, Nick, etc. For the most part, all of us were typical EA members before becoming dedicated or high-achieving ones. I know that was the case with me. Another example is Tom Ash—he told me how he got involved with the movement slowly over time, and now he is a central figure. Not sure about your own path :-)
So I think there’s a high downside to focusing on increasing the dedication of those likely to be at the peak of total impact, at the cost of placing demands for higher involvement on typical EA members. Right now, my perception is that the movement is too skewed in the direction of placing high demands for involvement, and not celebrating enough the things that typical EA members do.
Now, I think it would be quite beneficial if we can have separate messages targeted at those at the peak of total impact, encouraging them to do more because of their impact, and those who are closer to the casual pole of EA engagement, signaling to them inclusiveness and acknowledgment. This would take some work, but is certainly doable. Audience segmentation is a well-known strategy in public communication, and this may be an appropriate place for it. In other words, have different messages targeted at different EA audiences, optimizing each to the needs of its audience.
I don’t know what we’re arguing for anymore. I may be willing to agree that given the choice between “indiscriminately cause more people to be EAs” and “indiscriminately cause existing EAs to be more dedicated” we should pick the first option.
But, this is just not the case we’re in. It was possible to predict in advance that Cari Tuna was going to be very important (she had billions of dollars). It was possible to predict in advance that Eliezer or Nick Bostrom were going to be important (they are actual geniuses). Given these predictions it makes sense to spend more time talking to them and getting them on board with the relevant ideas. So, the argument you’re making doesn’t seem to generalize anywhere.
Also, you’re assuming a particular (and unusual) intervention mechanism—moral guilt/placing demands. This doesn’t seem like the way to increase dedication to me, so I think you’re rejecting a strategy that few people are actually employing.
I’d like to think we’re having a dialectic truth-seeking, rather than an argument :-)
The argument generalizes to when we do broad outreach around effective giving, rather than talking to individuals. I very much agree we should focus in our individual conversations on converting high-value people like Cari Tuna or Nick Bostrom. On the other hand, we cannot predict in advance many high-value people, like Tom Ash, who got involved casually and slowly, and right now is a central figure in the movement.
Jon Behar brought up some additional relevant points in his comment.
I think the comments of Taryn East above exemplify that this “placing demands” is what many people experience. I have also heard from plenty of others who left EA that they left because of this perception of ratcheting demands. Now, let me be clear—this may not be the explicit intention of the intervention mechanism that people are employing. Yet it doesn’t matter what intervention mechanism people think they are using—the key is what outcomes are occurring. If the outcome is that people experience the sensation of moral guilt/pressure, this is a big problem for the movement being welcoming and inclusive.
To help address this topic, I started a discussion on the EA FB, you might be interested in checking it out.
In recent years, some research has been done on movement building and organizing. The conclusion of “Hahrie Han (2015): How Organisations Develop Activists” is that successful organisations don’t rely solely on attracting people who are already motivated and skilled, but also on actively developing their activists by coaching and mentoring them from the bottom up and moving them up “the activist ladder”. Instead of organizing events, the leaders in high-engagement organisations “organize organizers”. The activist ladder may contain steps like I = occasional activist, II = typical activist, III = dedicated activist, etc. I think a discussion of whether we should concentrate on step I, II or III will remain inconclusive due to the high uncertainty of the parameters. It seems way more important to establish a (semi-)formal system, a pyramid of leaders who mainly engage in coaching and mentoring in the effective altruist movement.
I like the idea of a pyramid of coaches. EA Action already does something a bit like this, but much more work can be done.
You list some of the value of dedicated EAs as encouraging other people to do more. There’s something circular about using that number to calculate the relative value of dedicated vs. typical EAs.
Can you highlight what about it seems circular to you?
Nice post- good to see the discussion it’s generating.
While I find the first-order expected utility calcs useful, I think it’s also important to consider the direction and possible magnitude of second-order effects. I think these point strongly in favor of seeking more but less dedicated EAs, for reasons of scale.
We want the charitable marketplace to become more efficient- for more capital to flow to the highest impact projects, higher quality information, easier access to information, etc. We’re currently trapped in a negative feedback loop- donors don’t give based on impact, so charities aren’t incentivized to measure/report impact, so donors can’t find impact data, so it’s harder for donors to give based on impact, etc. The more people we have who buy into EA principles to some degree, the easier it’ll be to reverse this feedback loop. Take someone who gives 5% of their income to charity, half of it to EA causes and half to more traditional causes. If this person improves the impact of their traditional giving by doing more research or thinking about cause selection, it will barely show up in our expected utility calc but it will help propagate some of the key messages we care about.
There are also issues of optics. If every EA social media account magically gained 100x followers, the movement as a whole would appear a lot more credible.
So all else equal, if I knew that 1000 people were going to accomplish 1000 units of good, I’d much rather have that good distributed evenly rather than highly concentrated in a few people.
Good additional points for getting more value-aligned people engaged as typical EA members, and getting non-value aligned ones to behave like EA members.
I think the issue of optics is especially impactful. A movement that is perceived as having a large social impact would be more likely to draw traditional wealthy donors, and not only newly-wealthy people like Dustin Moskovitz and Cari Tuna.
I think something funny is going on with the definitions here. In explaining the “typical” versus “dedicated” distinction you say the following:
Which seems to imply that “dedicated” EAs are just the top 10% of all EAs. Later you make this claim more explicit:
But, if this is the way you’re defining the terms, then it’s trivially true that we should get more typical EAs. In fact, under this definition, it’s not possible to increase the number of dedicated EAs unless you attract more typical EAs. Perhaps I’ve missed something. If so, please feel free to explain.
I should have been more clear on this topic, thanks for pointing this out!
This is an area where we need to unpack the binary-seeming nature of the typical/dedicated divide. I believe it is true that to get more dedicated EA participants, we need to get more typical EA participants within the broad spectrum ranging from the casual engagement pole up to the fuzzy typical/dedicated divide in between the casual engagement pole and the highest engagement pole.
However, the question I am focusing on in the post is comparing putting demands on typical EA members and not making them feel included if they don’t perform to demands, versus making them feel included regardless and simply rewarding higher involvement. My take is that the second strategy will overall work better for advancing the movement.
To be honest, if you don’t have a non-circular definition of the key terms, I think you should basically throw the whole argument out.
I think I’m missing something regarding the circularity of definitions. Can you clarify what about it seems circular to you?
It seems like one could ground some of these numbers in the data we have on the EA movement. For example, I would trust this more if someone looked at the EA survey data and took the top 10% of people in terms of % of income they currently donate and see what % of the total money moved to effective charities they make up. I would guess that very dedicated EAs make up almost all the money moved in the EA movement. Is this a perfect proxy for utility? No but it seems to be a lot better than numbers drawn more from intuition.
Done.
Has anyone calculated a rough estimate for the value of an undergraduate student’s hour? Assume they attend a top UK university, currently are unemployed, and plan to pursue earning to give. Thanks in advance for any info or links!