Let’s say you’re going to donate some money, and plan to put some time into figuring out the best place to donate. The more time you put into the decision, the better it’s likely to be, but at some point you need to stop researching and actually make the donation. The more you’re donating, the longer it makes sense to keep looking, since better directing a larger amount of money is more valuable. Can you get the benefits of deeper research without increasing the time spent on research? Weirdly, you can! About five years ago, Carl Shulman proposed (and ran the first) “donor lottery”.
Say instead of donating $1k, you get together with 100 other people and you each put in $1k. You select one of the people at random (1:100) to choose where the pool ($100k) goes. This turns your 100% chance of directing $1k into a 1% chance of directing $100k. The goal is to make research more efficient:
If you win, you’re working with enough money that it’s worth it for you to put serious time into figuring out your best donation option.
If you lose, you don’t need to put any time into determining where your money should go.
This isn’t that different from giving your money to GiveWell or similar to decide how to distribute, in that it delegates the decision to someone who can put in more research. Except that it doesn’t require identifying someone better at allocating funds than you are; it just requires that you would:
Make better decisions in allocating $100k than $1k
Prefer a 1% chance of a well-allocated $100k to a 100% chance of a somewhat less well-allocated $1k.
If you come at this from a theoretical perspective this can seem really neat: better considered donations, more efficient use of research time, strictly better, literally no downsides, why would you not do this?
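To make the arithmetic concrete, here is a minimal sketch (my own illustration, using the hypothetical numbers from the example above: 100 donors contributing $1k each). The expected amount of money you direct is the same whether you donate directly or enter the lottery; only the variance changes.

```python
import random

# Hypothetical numbers from the example above: 100 donors, $1k each.
DONORS = 100
STAKE = 1_000
POOL = DONORS * STAKE  # $100k

# Donating directly: you direct $1k with certainty.
direct_ev = STAKE

# Entering the lottery: a 1-in-100 chance of directing the whole $100k pool.
lottery_ev = POOL / DONORS

print(direct_ev, lottery_ev)  # 1000 1000.0 -- same expected value

# Simulation: across many lotteries, the average amount you direct converges
# to the same $1k, but in any single year it is almost always $0.
trials = 100_000
directed = [POOL if random.randrange(DONORS) == 0 else 0 for _ in range(trials)]
print(round(sum(directed) / trials))  # roughly 1000
```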
Despite basically agreeing with all of the above, however, I’ve not participated in a donor lottery, and I think they’re likely slightly negative on balance. There actually are downsides, just not in areas that the logic above considers.
The biggest downside is that it makes your donation decisions less legible: it’s harder for people to understand what you’re doing and why. Let’s say you’re talking to a friend:
Friend: You’re really into charity stuff, right? Where did you decide to donate this year?
You: I put my money into a donor lottery. I traded my $10k for a 1% chance of donating $1M.
Friend: Did you win?
You: No, but I’m glad I did it! Let me explain why…
There are a few different ways this could go. Your friend could:
(a) Listen to your explanation at length, and think “this is really cool, more efficient donation allocation, I bet EA is full of great ideas like this, I should learn more!”
(b) Listen to your explanation at length, and think “maybe this works, but it seems like it could probably go wrong somehow.”
(c) Not have time or interest for the full explanation, and be left thinking you irresponsibly gambled away your annual donation.
I think (b) and (c) are going to happen often enough to outweigh both the benefit (a) and the benefit of the additional research. And this is in some sense a best case: your friend thinks well of you and is likely to give you the benefit of the doubt. The same conversation with someone you know less well or who doesn’t have the same baseline assumption that you’re an honest and principled person trying to do the right thing would likely go very poorly.
The other problem with that conversation is that you’re not really answering the question! They’re trying to figure out where they should donate, and are looking for your advice. Even if they come away thinking your decision to participate in the lottery makes some sense, they’re unlikely to decide to participate the first time they hear about the idea, and so do still need to decide where to give. It’s much better if you can explain what charity you picked, how you were thinking about it, and be able to answer followups. Importantly, I think this is better than explaining the donor lottery at illustrating EA thinking and helping people figure out if learning more about EA is something they’d enjoy.
I also have two smaller objections to donor lotteries:
If with some research you have a good chance of identifying better donation opportunities than “give to GiveWell or EA Funds”, I’d be excited for you to do that and write up your results. I think you’d likely influence other funders enough that the time investment would be worth it, and you’d learn a lot. If this goes well you get the benefit of winning a donor lottery without having to actually win!
When I was earning to give I was in a position similar to someone who had won a donor lottery, in that I had enough money to allocate that it would be worth putting in substantial time deciding what to do with it. But in practice I didn’t end up putting in that much time, the time I did put in didn’t shift my views, and I ended up donating to the same places I expect I would have if I’d been donating much less money.
(I think the tradeoffs weighed less heavily against donor lotteries in 2016, when they were first proposed, because there were many fewer people working full time on how to allocate EA money.)
I think most people should at least have a best guess about where they would donate if they hadn’t participated in a lottery, and if it’s easy then they should plausibly give a token amount directly (like 10%). This makes “give to lottery” much more comparable to “give directly” since you’ve still spent the time thinking about it, have your head in the game, and can talk from experience about that decision. (Another simple reason to do this is to confirm that actually diminishing returns aren’t a big deal wherever you were planning to donate.)
If someone really means to ask “where should I donate?” and not “where did you donate?” then it seems like you can just answer that question. Giving 10% to the object level makes that even easier, but I feel like I can basically do that even when I don’t give directly to the particular charity I’m discussing. (That said, I probably get asked this question much less than you, and by a very different distribution of people.)
I particularly like donor lotteries because I think they are one of the cleanest “wins” amongst all EA recommendations for small donors (though not one of the largest wins). I think it’s really complicated to assess whether you should defer to EAs that you don’t really know, but that there are great arguments that most donors would be better served by using a lottery. So I consider it one of the most legible ways to clearly demonstrate that EA is just better at donating than other communities (rather than merely having complex methodological disagreements).
I think that donor lotteries seem better for the health of the community than having a bunch of small donors who have no option other than deferring to a centralized recommender (whose work they often can’t check very thoroughly). It empowers more people to have meaningful influence over how charity is done. “Small donors should just defer to professionals” might be right but I don’t think it’s very legibly right, and it relies on a bunch of trust. In practice if you are donating $10-100k I think that a lot of that has to be social proof rather than having the time to dig into claims enough to reliably tell that GiveWell knows their stuff. And even if everyone defers to an evaluator, the form of accountability seems important and I’d prefer evaluators be accountable to 10x fewer donors with 10x more time. Relatedly, I think this makes it easier in practice to start a competitor and market to a smaller number of large donors based on quality product.
I do think a significant fraction of people (especially in tech or finance) respond positively to the idea of donor lotteries, and having the option value to talk about the object level or lotteries based on the audience seems good. Again, I think this might differ because we are talking to different audiences in a different way. I also think that I’m more interested in weird stuff than you, so I’m maybe particularly willing to lose out on prospective donors who aren’t willing to spend 30 minutes thinking about a plausible-looking but weird idea, or who can’t handle expected value calculations. And in general I more want EA to be the kind of place that does weird but good stuff.
I don’t think that “lots of people are thinking about the allocation of EA funds” has made it that much easier to be a small donor. Which of those people am I supposed to defer to? Also, I qualitatively don’t feel like the ratio of (money moved) vs (time spent thinking about allocation) is that much better than it was in 2016. I wouldn’t be surprised if both of those have grown ~4x in parallel.
I agree some lottery-winners give to the same kinds of things that they would have given to otherwise, but in the two cases where I have the most information it seems like the outcome was very different. (For several of these grants I think that giving smaller amounts would not have been an option.) It’s also worth noting that most small donors give much worse than you, and for many of them it would be great if “spend more time to think” got them up to “defer to GiveWell.”
I’d guess that your earning to give situation was similar to a small lottery winner but not a large lottery winner, and if you donated somewhere without significant diminishing returns then I think it might have made sense to go to an even larger scale. (Right now I personally feel like it’s pretty plausible that a philanthropist with a $10M budget should do a lottery with a big EA foundation up to a $100M budget, if they’d offer it, though I’d defer to funders and charities about what they think would make for a healthier ecosystem.)
I’m not sure that you’re making the wrong call, but I think it’s sort of weird/hypocritical to advertise EA by making donation choices that sacrifice altruistic impact in order to seem more normal.
Another effect is that I’d much rather evangelize EA to the kind of people who understand donor lotteries quickly.
“Seem more normal” isn’t quite what I’m going for; it’s more about there being value in doing things that are easier to explain, or that are clearly valuable even from worldviews different from your own. For example, someone choosing to live with roommates so they’re able to work for a non-profit or donate more is weird, but it’s not hard to explain, and people’s reaction is much more likely to be “I wouldn’t do that” than “that’s not actually good”.
I’d feel differently if I thought we were talking about a large amount of altruistic impact. If you think AI safety research or building pandemic shelters are what most needs doing, you shouldn’t go do something else just because it’s easier to explain to the average person. But I think the gains from lotteries are pretty low, low enough that when you consider the downside of it being more confusing it’s not worth it?
Another place this tradeoff comes up is with salary sacrifice: it’s more legible to donate money, but asking for a reduced salary has more altruistic impact.
I think that (normal charity) vs (lottery) is a clear improvement for a much wider range of worldviews than (normal charity) vs (defer to GiveWell).
I do agree that “defer to GiveWell” is easier to explain though. Or slightly more precisely: I think it’s easier to explain what GiveWell does well enough that someone can understand why you might think it’s the best option, but harder to explain what GiveWell does in enough detail that someone can verify for themselves that it’s actually better than their alternatives.
It feels like a bit of a Catch-22. Purposefully making donations in a way that you know will lead to less overall altruistic impact, in order to increase first-order impact, also seems to run afoul of similar, albeit maybe less severe, issues.
Disclaimers:
Writing quickly about a complex topic I have many vague ideas about
I won the $500k donor lottery, and it has been a very emotional experience; I think I will be in a better position to evaluate this in ~5 years, and I’m extremely biased
Even trying to take into account my strong bias, I’m strongly in favor of donor lotteries. Here are my 2 cents:
I can confirm that you do get negative reactions to the concept of a “donor lottery” from most people, but you get even more negative reactions when you tell people you’re donating most of your income, as I’m sure you know better than I do.
I would go with the advice in “why you should give to a donor lottery”.
I personally would recommend a 50-50 split between direct donations and the lottery for most people, but with high variance. I think it would heavily mitigate your points (b) and (c), and keep you informed enough about giving opportunities to provide useful advice.
You can still share with people your direct donations and I think you have roughly the same influence whether you donate X to an organization or 2X. You can mention the donor lottery only to people that might find it interesting or useful.
In my experience, there are now many funds and grantmakers to choose from! The problem moved one level up but it’s still there.
An advantage of moving to the ~$500k scale is that it makes sense for me to ask people who seem knowledgeable and trustworthy to confirm whether a fund/evaluator is actually reliable. It turned out that one (of the many) probably wasn’t reliable at the time, and I might well have donated to it otherwise.
I also think that donating to GiveWell’s “All grants fund” is probably just better than donating directly to one of the recommended charities, because GiveWell only funds particularly effective programs, and many top charities run many programs in many countries with various levels of cost-effectiveness. But I see many donors not making these kinds of considerations that I think they would make if they won a donor lottery.
Another way to say this: many EA donors, like silly me last year, do not give to GiveWell funds or EA Funds, maybe for reasons similar to your points (b) and (c).
When I was donating my last $10k last year, I was reading things that made me doubt the optimality of GiveWell’s recommendations (like this series). At that scale, it would have been hard to justify the time to investigate whether those claims had merit.
Some IMHO more serious downsides of the donor lottery:
Unilateralist’s curse, especially for longtermist funding where many opportunities have sign uncertainty
Potentially a less representative distribution compared to a “free market” approach where small funders distribute money according to their values.
e.g. if half of the funders care only about insect suffering and half care only about mental health, it might be better to allocate the funds accordingly, instead of having 100% insects or 100% mental health depending on who wins the lottery.
People might over-update on the opinions of a random lottery winner, compared to a grantmaker who was selected meritocratically or a funder who at least earned the money.
It might be treated by the winner like “monopoly money”, without the moral seriousness of considering the thousands of lives it impacts.
Some serious advantages of winning the donor lottery:
It makes sense to verify with much higher confidence that the fund/evaluator you would be donating to is actually reliable, and to compare funds with each other. (A popular one probably wasn’t reliable until months or years ago.)
It makes sense to go much deeper into EA and cause prioritization (e.g. how much I should care about animals, WELLBYs vs QALYs vs GiveWell moral weights, longtermism vs short-termism, population ethics in general, whether to fund research or proven interventions). This helps not just with allocating the donor lottery winnings, but with all future donations and career choices, and helps give much better donation advice to various friends.
I learned basic concepts like “population ethics” and “theory of change” only after winning the lottery. I expect many donors are not familiar with these basic topics, and I am still learning a lot. Also, how to decide allocation across cause areas seems like a very tricky question, sensitive to ethical questions on which there is no consensus.
Many reliable people, even at large funds, encourage funding diversity, especially for opportunities in the $50k–$1M range, which might be too large for very small funders and too small for very large funders. There are professional grantmakers for these, but they have different opinions on where the marginal $100k should go (which I think is good!).
Despite the perception of abundant funding for EA orgs, many of their leaders still spend a significant amount of time fundraising, which has a large opportunity cost. If you can identify things that would have otherwise been funded by EA funds anyway, you can save them significant amounts of time.
Publishing a grantmaking report after winning a donor lottery allows you to signal-boost projects that you considered especially valuable, providing them with more reach even if funding-wise you would already cover their funding gap.
It could accelerate the careers of potentially promising grantmakers; the correlation between grantmaking skill and grantmaking budget seems far from perfect. You could use the lottery funds to provide valuable experience to someone who one day might work for e.g. Schmidt Futures.
It would be easier to find opportunities for leverage: you influence donations less when losing but potentially more when winning. I did end up talking to way more people about EA (with mixed results).
I think the existing donor lotteries might have been too small for someone at your scale to appreciate the advantages, would you feel the same about a ~$10M donor lottery?
In my experience, explaining donor lotteries is actually pretty engaging to the right sort of person. And I expect that sort of person to be a good fit for EA.
You are/were an extremely prominent earning-to-giver (you and Julia have blogs; you guys have been profiled in news articles for your donations a number of times). I suspect the vast majority of EA donors are not in a position where they inspire more donations-by-casual-donors than the size of their own donations.
The comparison here isn’t inspiration vs own donations, but inspiration vs marginal improvement due to better research, which is a much lower bar.
Yeah this is a good argument. I suspect they’re in the same order-of-magnitude but I agree the argument is nontrivial.
In practice most donations end up funging against Open Phil or FTX’s last dollar I guess, so a lot of this hinges on whether/how much you expect to do better than that bar.
Interestingly, I recently tried this (here). It led to some money moved, but less than I hoped. I would encourage others to do the same, but also to have low expectations that anyone will listen to them or care or donate differently; I don’t expect the community is that agile/responsive to suggestions.
In fact, the whole experience made me much more likely to enter a donor lottery: I now have a long list of places I expect should be funded and nowhere near enough funds to give, so I might as well enter the lottery and see if that helps solve the problem.
[EDIT: As Linch pointed out, not necessarily good reasons to not participate in a donor lottery]
I think there are two additional criteria for participation in a donor lottery [EDIT: assuming you are less comfortable with random/speculative impact]:
1) The average winner would do at least as good a job selecting where to donate as you would have done if you hadn’t participated.
This means their much greater time investment in investigating where to donate would yield donation options at least as good as those from your cursory investigation. That’s a relatively low bar, but it’s still there.
If you are unusually rational and conscientious with your giving (which includes most in the EA community), then there is a good chance your giving will be more effective than that of a less conscientious person, even if they spent substantially more time thinking about where to donate. I anticipate most EAs would be more likely to join a donor lottery with other EAs than a lottery with participants sampled from the general population. For a random person in the population, there is a good chance your “cursory” investigation is as good as or better than their “detailed” investigation.
2) You share similar values about what is most important with the others in the donor lottery
Say I joined a donor lottery with three other people. They have different ideas of what is most beneficial to the world. For one, it is building monuments to Cthulhu. For another, it is dyeing cats purple. And for the last one, it is increasing the number of shrubs in their neighborhood. Even if more time allows them to find the best giving opportunities for the things they care about, I see little additional value in a world with a higher concentration of Cthulhu monuments, purple cats, or neighborhood shrubbery.
I think this is incorrect; if a normal lottery ends up being +EV, it’s still rational to put your money in it, even if you then think the other lottery participants have worse epistemics or values than you do (which is normal and unsurprising).
Yeah, I suppose that’s true. It’s still +EV even if the money disappears into a black hole when you don’t win, because if you win, you get to spend more time deciding where to donate.
I guess my hesitation around the lotteries is that I’m uncomfortable with expected value calculations at the extremes (e.g. a one in a million chance to win for a million times the impact), and that I’d experience regret if I didn’t think the eventual winner was thoughtful in their giving (even if it is still +EV).
We’re not talking about one in a million odds, though? We’re talking about 1%
I mean, I’d still be hesitant at a 1% chance. Let’s say I give my donations every year for 40 years to a donor lottery with a 1% chance of winning. Then there’s a 0.99^40 ≈ 67% chance I’d never get to choose where the money goes. Personally, I’m not sure I could handle that. I’d be more comfortable with 10% odds.
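For what it’s worth, this arithmetic generalizes. Here is a quick sketch (my own, using the odds discussed above) of the chance of never winning over repeated annual lotteries:

```python
# Chance of never winning after `years` independent annual lotteries,
# each with win probability `p`.
def p_never_win(p: float, years: int) -> float:
    return (1 - p) ** years

print(f"{p_never_win(0.01, 40):.0%}")  # ~67% at 1% odds, as above
print(f"{p_never_win(0.10, 40):.0%}")  # ~1% at 10% odds
```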
Seems similar to startups as an earning to give approach. It makes more sense for 10 people to attempt start-ups than to pursue high-paying careers because the expected value is larger. But it really sucks for the people that never succeed, and many probably give up earning to give and leave the movement altogether. It’s a high price to pay in terms of community and mental health for greater expected aggregate impact.
Ideally there would be some way to involve lottery losers in the win—at least acknowledging them, or having them give feedback on a draft giving proposal. That would help counteract the wearing feeling of pitching money into the void, without negating too much of the benefit of having one person do all the heavy work of deciding where to give.
I think this post misses something: the impact of a donation is typically a nonlinear function of donation size.
Specifically, a donation of (say) $200,000 typically achieves more than 100x more impact than a donation of $2,000.
Why?
Because with $200,000, a charity can take meaningful actions now, whereas with $2,000 the funds will likely sit in the charity’s bank account until the charity has built up enough funds from other sources to use the money meaningfully.
And money has a time value.
Details/caveats:
Note that this depends on the “unit of action” for the organisation. E.g. for AMF, it may be that $2,000 is enough to do another distribution of bednets (I haven’t checked this). In which case the value of donations probably does scale linearly (at least when comparing a $2,000 donation and a $200,000 donation). But the non-linearity point would be valid if we were comparing $20 with $2,000 (i.e. $2,000 would be more than 100x as valuable as $20).
However for lots of organisations the “unit of action” is the amount of money needed to fund another person’s salary, in which case the non-linearity point would apply in the example I gave ($2,000 vs $200,000).
Obviously all of this is assuming favourable Room For More Funding conditions—i.e. that we have not yet hit diminishing marginal returns on donations.
Also, the point about “money has a time value” hides lots of detail which I haven’t gone into here, but can do if someone requests it.
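To illustrate the “unit of action” point, here is a toy model (my own illustration, with a hypothetical $50,000 unit cost such as one salary) of how many units a charity can fund immediately from a single donation; under this model the $2,000 only contributes once other donations arrive, which is where the time value of money comes in.

```python
# Toy model: impact accrues only when a charity can fund another complete
# "unit of action" (e.g. one salary). Smaller donations sit idle until
# topped up from other sources, and money loses value while it waits.
UNIT_COST = 50_000  # hypothetical: one unit of action = one salary

def immediate_units(donation: int, unit_cost: int = UNIT_COST) -> int:
    """Units of action the charity can fund right away from this donation alone."""
    return donation // unit_cost

for donation in (2_000, 200_000):
    print(f"${donation:>9,}: {immediate_units(donation)} unit(s) funded immediately")

# $    2,000: 0 unit(s) funded immediately
# $  200,000: 4 unit(s) funded immediately
```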
There are also extra transaction costs in transferring money through an extra step, although that argument can also be made about regrantors like EA Funds, and it probably isn’t a big enough deal to change what you should decide.
I think you point to relevant tradeoffs here. I myself am currently testing a different scheme to determine my donation: let the public (or anyone interested in participating) decide collectively where to donate. I believe this might...
improve the decision quality due to crowd-sourcing information and reducing any bias (moral or otherwise) that I might have
encourage others to think about various causes and potentially make them donate more as well
The downside is probably that the total effort spent on this decision is larger than it would be if I made it on my own...
My polls sometimes have a fixed total amount and sometimes a total that increases with participation (to give an incentive); typically around 100 people participate.
What do you think of this?