My take: rank-and-file EAs (and most EA local communities) should be oriented around donor lotteries.
Background beliefs:
I think EA is vetting constrained
Much of the direct work that needs doing is network constrained (i.e. requires mentorship, in part to help people gain context they need to form good plans)
The Middle of the Middle of the EA community should focus on getting good at thinking.
There’s only so much space in the movement for direct work, and it’s unhealthy to set expectations that direct work is what people are “supposed to be.”
I think the “default action” for most EAs should be something that:
Is simple, easy, and reasonably impactful
Provides a route for people who want to put in more effort to do so, while they practice building actual models of the EA ecosystem.
I don’t think it’s really worth it for someone donating a few thousand dollars to put a lot of effort into evaluating where to donate. But if 50 people each put $2000 into a donation lottery, then they collectively have $100,000, which is enough to justify at least one person’s time in thinking seriously about where to put it. (It’s also enough to angel-invest in a new person or org, allowing them to vet new orgs as well as existing ones)
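To make the mechanics concrete, here is a minimal sketch in Python, assuming the standard proportional-odds design (each donor’s chance of winning equals their share of the pot); the run_donor_lottery helper and donor names are purely illustrative, not any real lottery’s implementation:

```python
# Minimal sketch of a proportional-odds donor lottery (illustrative only):
# each donor's chance of directing the whole pot equals their share of the
# total contributions, so expected money moved per donor is unchanged.
import random

def run_donor_lottery(contributions, seed=None):
    """Return (winner, pot); the winner is drawn with probability proportional to their contribution."""
    rng = random.Random(seed)
    pot = sum(contributions.values())
    winner = rng.choices(list(contributions), weights=list(contributions.values()), k=1)[0]
    return winner, pot

# 50 donors at $2,000 each pool a $100,000 pot; each has a 2% chance of
# directing it and an unchanged expected allocation of $2,000.
donors = {f"donor_{i}": 2_000 for i in range(50)}
winner, pot = run_donor_lottery(donors, seed=0)
print(f"{winner} decides where ${pot:,.0f} goes")
```

The draw only changes who does the allocation research; in expectation each participant still directs $2,000.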
I think it’s probably more useful for one person to put serious effort into allocating $100,000 than for 50 people to each put token effort into allocating $2000.
This seems better to me than generic Earning to Give (except for people who earn enough that donating, say, $25,000 or more is realistic).
I also think there’s some potential to re-orient the EA pipeline around this concept. If local EA meetups did a collective donor lottery, then even if only one member ends up allocating the money, they could still solicit help from others to think about it.
My experience is that EA meetups struggle a bit with “what do we actually do to maintain community cohesiveness, given that for many of us our core action is something we do a couple of times per year, mostly privately?” If a local meetup did a collective donor lottery, then even if only one person wins the lottery, they could still solicit help from others to evaluate donation targets and make it a collective group project (while being the sort of project that’s okay if some people flake on).
My intuition is that the EA Funds are usually a much better opportunity in terms of donation impact than donor lotteries, where one person does independent research themselves (instead of relying almost entirely on recommendations), unless you think you can do better (according to your own ethical views) than the researchers for each fund. They typically have at least a few years of experience doing research in their respective areas, often full-time, they have the time to consider many different neglected opportunities, and they probably get more feedback than you’ll seek. I think the average EA is unlikely to have the time or expertise to compete, especially if they’re working full-time in an unrelated area. If your ethical views are similar to those of the grantmakers of your preferred EA fund, I’d expect every dollar not given to the fund (either directly or after winning the lottery) would do more good if given to the fund instead, and the difference could be pretty big.
Of course, you could enter a donor lottery and, if you win, just give it all to an EA fund without doing any research yourself. I don’t know if this would be better or worse than just donating directly to the EA funds. I don’t think the argument for economies of scale really applies here, since the grantmakers are already working full-time on research in the areas they’re making grants for.
Maybe a good approach would be to enter the lottery, and if you win, do research on charities, seeking feedback from the EA community and specifically the grantmakers of the EA fund you align most with, and then just donate everything to the fund. If your research is good enough, it’ll inform their recommendations. Maybe this would be less motivating if you expect your research to not be used, but it could be more motivating, because of the feedback and the extra external pressure (to produce something valuable for the fund and to not look stupid).
It seems to me like this is unlikely to be worse. Is there some mechanism you have in mind? Risk-aversion for the EA fund? (Quantitatively that seems like it should matter very little at the scale of $100,000.)
At a minimum, it seems like the EA funds are healthier if their accountability is to a smaller number of larger donors who are better able to think about what they are doing.
In terms of upside from getting to think longer, I don’t think it’s at all obvious that most donors would decide on EA funds (or on whichever particular EA fund they initially lean towards). And as a norm, I think it’s easy for EAs to argue that donor lotteries are an improvement over what most non-EA donors do, while the argument for EA funds comes down a lot to personal trust.
I don’t think all of the funds have grantmakers working full-time on having better views about grantmaking. That said, you can’t work full-time on it if you win a $100,000 lottery either. I agree you are likely to come down to deciding whose advice to trust and doing meta-level reasoning.
I think it might be better for the grantmakers, in one way, if the total donations they receive each year had lower variance, since that would make it easier to decide whether to bring in more grantmakers or to allocate more time to thinking about grants. I think many of the grantmakers already work more than full-time, so they may not be very flexible in choosing how much extra time they can spend on research for the grants. I suppose they could just save most of the “extra” donations for future disbursements, though.
Makes sense.
Besides more talent (as Stefan added) and expertise (including awareness of a much larger number of opportunities) on average, I think grantmakers also have better processes in place for their research, e.g. more feedback. I think at least one of the following four will apply to almost all EAs:
1. They have different priors or ethical views from the grantmakers and these have a large impact on how good the different charities would look as opportunities, if they had the same information. I think this could apply to a significant proportion.
2. They would be roughly as good at research for grantmaking for one of the EA funds, considering also the time they’ll have to think about it. This seems unlikely to apply to a significant proportion. I’d guess < 1% of EAs, and < 1% of EAs to which 1 doesn’t apply.
3. They have (or will have) important information about specific opportunities the grantmakers wouldn’t have that would be good enough to change the grants made by the grantmakers. I’d guess this would be much less than half of EAs, including much less than half of EAs to which 1 doesn’t apply.
4. They should actually defer to the grantmakers.
So, for most EAs, if 1 doesn’t apply to them, i.e. they don’t differ too much in their priors and ethical views from the grantmakers of one fund, then they should be giving to that fund.
Global Health and Development has Elie Hassenfeld from GiveWell, and each of the others has at least a Program Officer from the OPP, either Lewis Bollard or Nick Beckstead. Three others on the Animal Welfare fund work in charity research or donor advice roles: at a charity fund (Kieran Greig, Farmed Animal Funders), at an org that gives donation advice (Natalie Cargill, Effective Giving), and at a charity that does prioritization and charity foundation research (Karolina Sarek, Charity Entrepreneurship); the last one leads another grant program (Alexandria Beck, Open Wing Alliance).
Besides the fact that some people have much more experience, another consideration is differences in talent. My guess is that some people have much greater talent for researching donation opportunities than others.
“EA Funds … have the time to consider many different neglected opportunities”
I just want to point out that the administrators of EA Funds are volunteers working other full-time jobs.
Those other jobs often involve looking at many different opportunities, e.g. grantmaking, donor advising, prioritization research or charity evaluation. Global Health and Development has Elie Hassenfeld from GiveWell, and each of the others has at least a Program Officer from the OPP, either Lewis Bollard or Nick Beckstead.
My understanding (not confident) is that those people (at least Nick Beckstead) are more like advisors acting as a sanity check (or at least that they aren’t the ones putting most of the time into the funds).
My background assumption is that it’s important to grow the number of people who can work full-time on grant evaluation.
Remember that GiveWell was originally just a few folks doing research in their spare time.
What about donor coalitions instead of donor lotteries?
Instead of 50 people each putting $2000 into a lottery, you could have groups of 5-10 people each put $2000 into a pot and jointly agree where to distribute it.
Pros:
-People might be more invested in the decision, but wouldn’t have to do all the research by themselves.
-Might build an even stronger sense of community. The donor coalition could meet regularly before the donation to decide where to give, and meet up after the donation for updates from the charity.
-Avoids the unilateralist’s curse.
-Less legally fraught than a lottery.
Cons:
-Time consuming for all members, not just a few.
-Decision-making by committee often leads to people picking ‘safe’, standard options.
I like this idea, though to boost your signal I’d switch “donor coalitions” to “donor crews,” in reference to the Microsolidarity movement, which I hope will collide with the EA community soon enough.
In a nutshell, Microsolidarity argues for (1) a theory of social groups with more categories (those below) and (2) organizational plans that consider different strategies for different categories. I’d therefore describe your strategy as experimenting with “donor crews,” as opposed to the much more common “donor selves,” where donors choose charities alone, or “donor crowds,” where everyone settles on donating to GiveWell or some other common aggregator. I think there is wide-open space for EA strategies revolving around crews:
Self (1 person)
Dyad (2 people)
Crew (3-8)
Congregation (30-200)
Crowd (200+)
This certainly seems like a viable option. I agree with the pros and cons described here, and think it’d make sense for local groups to decide which approach works better for them.