I don’t understand—how is another random person’s judgment less biased than your own? Maybe just average the two, or conduct a survey and use the average/median?
I think I noted above in the article that the 3-6X estimate came from someone explicitly disagreeing with my sentiments expressed in a prior post, so I went with that estimate as a means of de-anchoring. I like the idea of conducting a survey, and I figured having readers put in their own numbers in comments would help address this issue. Seems to have worked well, as you gave your own estimate, and so did others in the comments. But a survey would also be quite good, something to think about—thanks!
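If a survey were run, aggregating the responses is straightforward; here is a minimal sketch (the estimate list is hypothetical, purely to illustrate why the median can be more robust than the mean when a few respondents give very large multipliers):

```python
import statistics

# Hypothetical survey responses: each number is one reader's estimated
# impact multiplier ("EA participant vs. typical person").
estimates = [3, 6, 10, 20, 100]

mean_estimate = statistics.mean(estimates)      # pulled upward by the 100
median_estimate = statistics.median(estimates)  # robust to the outlier

print(mean_estimate, median_estimate)
```

With these made-up numbers the mean lands well above most individual answers while the median stays near the middle of the pack, which is one reason to report both.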
I think you are right about the difficulty of estimating flow-through effects. But that counts in favour of full-timers and against part-timers. Full-timers are more likely to improve these estimates, and to respond to arguments about them.
I hear you that dedicated people will likely update more quickly based on new information; I included that in the part of the article about the 3-6X difference, namely points 2 and 3.
My enthusiasm about full-timers is partly driven by enthusiasm for existential risk over the causes part-timers focus on (GiveWell-recommended charities).
I like GiveWell’s work on the Open Philanthropy Project, which I think enables it to help channel the donations of more typical EA participants to causes like existential risk.
This is too pessimistic in my view. Members of the general public include great researchers, business-people and activists, who sometimes make huge breakthroughs that benefit us all (whether intended or not). It also includes people in professions that consistently make modest contributions to others’ lives and maintain the infrastructure that allows civilization to prosper, like teachers, farmers, doctors, social workers, police, and so on. It also includes many who simply make good choices that benefit themselves and their families.
My response would be that I find your perspective a bit optimistic :-) We have to keep in mind that many non-EA participants harm rather than improve the world, for example people who produce cigarettes. While that one is obvious, I would argue that, more broadly, the general consumer market has many downsides, and some aspects of it harm rather than improve the world. Likewise, some great researchers and philanthropists may be contributing to rather than reducing existential risk, and thus harming rather than improving the world.
This leads me to challenge the idea of an upper bound of a 100x impact gain from becoming a GWWC member, as participating in the EA movement would likely correlate with people moving away from activities that actively harm the world.
I am sympathetic to the idea that, on average, what humans are doing is neither good nor bad, but from that perspective we are back at the divide-by-zero problem. Since we don’t have a good sense of how good a typical person’s work is—not even whether it’s positive, neutral, or negative—we should avoid using it as a unit of measure.
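The divide-by-zero worry can be made concrete with a short sketch (all numbers hypothetical, chosen only to show how the implied multiplier behaves as the assumed baseline approaches zero):

```python
# Hypothetical illustration of the divide-by-zero problem: the implied
# multiplier is extremely sensitive to the assumed impact of a "typical person".

def implied_multiplier(ea_impact: float, typical_impact: float) -> float:
    """Ratio of an EA participant's impact to a typical person's impact."""
    return ea_impact / typical_impact

EA_IMPACT = 10.0  # arbitrary impact units for one EA participant
for baseline in [5.0, 1.0, 0.1, 0.01]:
    print(f"baseline={baseline}: multiplier={implied_multiplier(EA_IMPACT, baseline):.0f}")
```

As the baseline shrinks toward zero the multiplier diverges, and if the baseline were negative the ratio would flip sign entirely, which is exactly why the "typical person" makes a fragile unit of measure.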
I generally see people as making the world slightly better. The world seems to have become slightly better over time, so before the EA movement was around, the world was already improving. This suggests a non-zero value for the “typical person.”
However, the EA movement, per individual within it, has improved the world a great deal more than the typical person, and has the potential to improve it even further as it gets more value-aligned people on board, or more non-value-aligned people behaving like EA participants. So this is the value I am trying to get at with the large difference between “typical person” and “EA participant.” We can have a conversation about the numbers, of course :-)
The 3-6X number was a vague guess. I would now revise it to 10X, 20X, or more as an average.
Fair enough :-) What are your motivations for giving a different number? What do you think is closer to the truth and why do you think so?
I was only thinking of the amount of money moved when I said that number. By EAs being more informed about principles of cause selection and mechanisms to do good in the world, their money will go much further. Second-order long term epistemic effects of the movement being populated with more involved and serious people are difficult to quantify but probably significant.
Cool, thanks for sharing that!