First, I’ve donated 10 dollars to Ought here (effectively 35):
Make a $10 donation into $35 - EA Forum (effectivealtruism.org)
Given the small amount, I didn’t put much thought into it, so I don’t want to give detailed reasons here, to avoid spreading inaccurate memes. The very basic reason I chose an organisation working on AI safety was concern for the long-term future of humanity.
Second, I’m planning to do the rest and bulk of my giving through the donor lottery, mostly for the standard reasons found at the link. (One sentence summary: The expected amount donated is the same as if given directly, but if you win, the much higher amount will justify putting more careful thought into your donation.)
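The expected-value claim in the summary can be made concrete with a few lines of arithmetic. The contribution and block sizes below are hypothetical illustration numbers, not figures from the post:

```python
# Minimal sketch of the donor-lottery expected-value argument.
# The $5,000 contribution and $100,000 block size are hypothetical
# illustration numbers chosen for round arithmetic.

contribution = 5_000    # what I put into the lottery
block_size = 100_000    # total size of the lottery block

# The probability of winning is proportional to my share of the block.
p_win = contribution / block_size       # 0.05

# If I win, I direct the whole block; otherwise I direct nothing.
expected_donation = p_win * block_size  # 5_000.0

# In expectation, the amount donated equals giving directly...
assert expected_donation == contribution
# ...but conditional on winning, the amount is 20x larger, which is
# what justifies putting much more research into where it goes.
print(p_win, expected_donation)
```

The point of the sketch is just that the expectation is unchanged while the conditional stake grows, so the research effort per expected dollar can increase.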
Specifically, I am giving to the 100k block. That is because at 20k I would probably rather lose than win: the amount would be big enough that putting in effort and research would matter, but perhaps too small to justify delaying any career opportunities. At 100k I’d rather win: at that point I think it would be worth taking some time off to focus on the decision. That would be super interesting and would hopefully help me fine-tune my thinking on some important EA matters. It could also tell me whether I would enjoy being a grant-maker, and give me something to show if I decided I did. Given that I’d already rather win than lose at 100k, 500k is out of the question.
In the past I donated through the EA Funds. I still think that’s probably a decent way of giving, and I might even end up giving the money there if I won the lottery, though I probably wouldn’t.
Part of the reason I chose the lottery instead is that I think it sits closer to the optimum on the spectrum of how large the donation decisions individuals get to make are:
Donors in the range of up to a few thousand dollars may not put in enough research to make optimal decisions.
At the other extreme, if individuals get to decide over budgets of millions or more, that may skew the total EA portfolio too far towards their idiosyncratic preferences.
I feel there must exist an optimum between these two extremes, and that this optimum is probably very roughly in the 100k ballpark. However, I don’t have strong opinions on the exact order of magnitude. It may be in the millions, in which case the specific argument above against donating to the EA Funds vanishes. Of course, things would also depend on the specifics: maybe spectacularly good allocators should get much larger chunks, though it is probably very hard to tell who they are.
Interesting reasoning, thanks for sharing!
Regarding the optimum size for an individual donor to be, you or others might find this post (at least tangentially) interesting, if you haven’t seen it already: Risk-neutral donors should plan to make bets at the margin at least as well as giga-donors in expectation.