The key numbers to estimate here are:
The relative (incremental) good done by someone who dedicates their whole career to doing good vs the (incremental) good done by someone who takes a moderate interest in effective altruism (A/B).
The relative difficulty of recruiting an extra person into the former camp vs the latter (X/Y).
At that point the relative value of pursuing each approach is simple to calculate (A/B * Y/X). Of course we will always want a mixture of the two, because of (i) declining returns in each approach and (ii) complementarities between the two.
I would spend more time running over the considerations regarding what these numbers should be. For what it’s worth, I think the 3-6x is too low, and a figure in the range of 10-100x would be reasonable. If a full-time EA simply doubles their income, gives twice as large a share of their income, and finds a charity that is twice as good as the one a part-time EA finds, that is already an 8x improvement. And that is restricted to earning-to-give options alone.
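To make that concrete, here is a rough back-of-the-envelope sketch in Python; every input is an illustrative placeholder, not a real estimate:

```python
# Toy comparison of recruiting a full-time EA vs a part-time EA.
# Every input below is an illustrative guess, not a measured quantity.

A = 8.0   # incremental good done by a full-time EA (arbitrary units)
B = 1.0   # incremental good done by a part-time EA
X = 20.0  # effort needed to recruit one extra full-time EA
Y = 1.0   # effort needed to recruit one extra part-time EA

# Relative value of pursuing the full-time approach over the part-time one.
relative_value = (A / B) * (Y / X)
print(f"Relative value of the full-time approach: {relative_value:.2f}")

# The 8x illustration above: double the income, double the share given,
# and a charity twice as effective.
print("Implied A/B from earning to give alone:", 2.0 * 2.0 * 2.0)
```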
I don’t think we need to estimate the baseline good done by someone who is not involved in EA to figure out the relative increments. Nonetheless, I think your estimates here are too high:
“As you can see, the impact of typical EA participants is ~28500% more than an ordinary member of the public, and the impact of dedicated ones is ~450% more than typical ones...”
This would imply that a dedicated EA is doing over 1000x more good for the world than an average member of the general public. I don’t think that is likely. To start with, some people are doing very valuable strategic work without any involvement in EA; others end up doing a lot of good just by accident, guided by common sense, market prices, or whatever else. Furthermore, because the effect of anything on the long-term future is so unclear, many EA projects may end up doing less good than it superficially appears today. That makes such a dramatic multiple implausible on its face.
The main justification I could see for such a huge multiple would be if you thought that on average people were neither making the world better nor worse. In that case using a multiple for comparison is unhelpful, as all you are doing is dividing by a number that is approximately zero.
Glad to talk about the numbers! I specifically used the 3-6X as it came from outside of myself, to minimize the bias of my own personal take on the matter.
Regarding the 10-100X, I personally think that is a bit high, especially considering your latter point of “many EA projects may end up doing less good than it superficially appears.” I agree strongly with that statement. For instance, according to GiveWell’s write-up, life-saving interventions such as malaria nets in areas where access to contraception is weak are likely to lead to some acceleration of population growth, which in turn has negative consequences both for human and animal well-being that we cannot easily estimate. This is only one of many examples.
For this reason, I believe the 3-6X figure for the current relative impact of dedicated versus typical EA members is closer to the truth. My take is that it’s more important for more people to be involved in the EA movement, since I have a strong belief that over time, the EA movement will figure out better ways of estimating the impact of various interventions, and then we as a movement can update and shift our giving. Similarly, it’s important for more non-EAs to behave like EAs and give effectively—not as part of the movement, but influenced by the memes of the movement to update their giving choices based on shifting evidence.
I think on average people are making the world slightly better, guided by various incentive structures. But on average people are not committed to making the world as good as it can get through their actions. I think this intentionality on the part of EA participants, our willingness to devote sizable resources to this area, and our willingness to update based on evidence justifies the huge multiple, regardless of the fact that some EA projects may end up doing less good than it superficially appears. I have strong confidence in the movement getting better and better over time at choosing the best areas to prioritize and growing in structural and memetic coherence.
“I specifically used the 3-6X as it came from outside of myself, to minimize the bias of my own personal take on the matter.”
I don’t understand—how is another random person’s judgment less biased than your own? Maybe just average the two, or conduct a survey and use the average/median?
I think you are right about the difficulty of estimating flow-through effects. But that counts in favour of full-timers and against part-timers. Full-timers are more likely to improve these estimates, and respond to arguments about them.
My enthusiasm about full-timers is partly driven by enthusiasm for existential risk over the causes part-timers focus on (GiveWell-recommended charities).
Regarding the good done by people interested in EA vs the general public: let’s say, as an upper bound, that someone in this class earns a typical graduate wage and gives 10% to the Against Malaria Foundation over a 40-year career. In that case their EA-related activities might save ($7,000 * 40 / $3,000) 93 lives in developing countries. To make that 28500% figure hold, we have to think that the average non-EA only does the equivalent good, over the course of their entire private and professional life, of saving a third of a life in the developing world. This is too pessimistic in my view. Members of the general public include great researchers, business-people and activists, who sometimes make huge breakthroughs that benefit us all (whether intended or not). It also includes people in professions that consistently make modest contributions to others’ lives and maintain the infrastructure that allows civilization to prosper, like teachers, farmers, doctors, social workers, police, and so on. It also includes many who simply make good choices that benefit themselves and their families.
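To spell out that arithmetic, here is a minimal sketch; the $7,000 donation, 40-year career, and ~$3,000-per-life figures are the rough placeholders used above, not precise estimates:

```python
# Back-of-the-envelope check of the lives-saved figure and the implied multiple.
annual_donation = 7_000      # ~10% of a typical graduate wage, as assumed above
career_years = 40
cost_per_life_saved = 3_000  # rough cost per life saved via the Against Malaria Foundation

lives_saved = annual_donation * career_years / cost_per_life_saved
print(f"Lives saved over a career: ~{lives_saved:.0f}")  # ~93

# "28500% more" impact means roughly 286x the impact of an average person.
claimed_multiple = 1 + 28_500 / 100
implied_baseline = lives_saved / claimed_multiple
print(f"Implied lifetime good of an average person: ~{implied_baseline:.2f} lives")  # ~1/3 of a life
```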
On top of that I think saving a life (and probably improving someone’s education, etc) in a rich country is more valuable than saving one in the world’s poorest countries for two reasons: firstly, welfare in rich countries is significantly higher so the gain to the person themselves is greater; secondly, people in the rich world have more opportunities to contribute to global public goods through innovation, transfers to the disadvantaged, or simply growing the world’s stock of physical/human capital.
I would suggest an upper bound of a 100x impact gain from becoming a GWWC member.
“I don’t understand—how is another random person’s judgment less biased than your own? Maybe just average the two, or conduct a survey and use the average/median?”
I think I noted above in the article that the 3-6X estimate came from someone explicitly disagreeing with my sentiments expressed in a prior post, so I went with that estimate as a means of de-anchoring. I like the idea of conducting a survey, and I figured having readers put in their own numbers in comments would help address this issue. Seems to have worked well, as you gave your own estimate, and so did others in the comments. But a survey would also be quite good, something to think about—thanks!
“I think you are right about the difficulty of estimating flow-through effects. But that counts in favour of full-timers and against part-timers. Full-timers are more likely to improve these estimates, and respond to arguments about them.”
I hear you on the point that dedicated people will likely update more quickly based on new information; I included that in the part of the article about the 3-6X difference, namely points 2 and 3.
“My enthusiasm about full-timers is partly driven by enthusiasm for existential risk over the causes part-timers focus on (GiveWell-recommended charities).”
I like GiveWell’s work on the Open Philanthropy Project, which I think enables it to help channel the donations of more typical EA participants to causes like existential risk.
“This is too pessimistic in my view. Members of the general public include great researchers, business-people and activists, who sometimes make huge breakthroughs that benefit us all (whether intended or not). It also includes people in professions that consistently make modest contributions to others’ lives and maintain the infrastructure that allows civilization to prosper, like teachers, farmers, doctors, social workers, police, and so on. It also includes many who simply make good choices that benefit themselves and their families.”
My response would be that I find your perspective a bit optimistic :-) We have to keep in mind that many non-EA participants harm rather than improve the world, for example people who produce cigarettes. While that one is obvious, I would argue that, more broadly, the general consumer market has many downsides, and some aspects of it harm rather than improve the world. Likewise, some great researchers and philanthropists may be contributing to rather than reducing existential risk, and thus harming rather than improving the world.
This leads me to challenge the idea of an upper bound of a 100x impact gain from becoming a GWWC member, as participating in the EA movement would likely correlate with people moving away from activities that actively harm the world.
I am sympathetic to the idea that on average what humans are doing is neither good nor bad, but from that perspective we are back at the divide by zero problem. As we don’t have a good sense of how good the work of a typical person is—not even whether it’s positive, neutral or negative—we should avoid using it as a unit of measure.
I generally see people as making the world slightly better. It seems that the world has become slightly better over time, in comparison to the past. So before the EA movement was around, the world was improving. This suggests a non-zero value for the “typical person.”
However, the EA movement, per individual within it, has improved the world a great deal more than the typical person, and has the potential to improve it even further as it gets more value-aligned people on board, or more non-value-aligned people behaving like EA participants. So this is the value I am trying to get at with the large difference between “typical person” and “EA participant.” We can have a conversation about the numbers, of course :-)
The 3-6x number was a vague guess. I would revise it to 10x, 20x or more for averages.
Fair enough :-) What are your motivations for giving a different number? What do you think is closer to the truth and why do you think so?
I was only thinking of the amount of money moved when I said that number. Because EAs are more informed about principles of cause selection and mechanisms for doing good in the world, their money will go much further. Second-order, long-term epistemic effects of the movement being populated with more involved and serious people are difficult to quantify but probably significant.
Cool, thanks for sharing that!