In reality, if we can figure out how to give a lot for a year or two without becoming selfish, we are more likely to sustain that giving over the long run. This strengthens the case for making larger donations.
Yep, I agree. In general, the real-life case is going to be more complicated in a bunch of ways, which tug in both directions.
Still, I suspect that, even if someone managed to donate a lot for a few years, there’d still be some small independent risk of giving up each year. And even a small such risk cuts your expected lifetime donations by quite a bit: e.g., a 1% p.a. risk of giving up, over a 37-year giving career, cuts the expected value by about 16% (and by far more if your income increases over time).
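To make the arithmetic concrete, here’s a minimal sketch of that expected-value calculation. It’s a simplification: it assumes donations are a constant amount each year and that the dropout risk is fixed, independent, and permanent.

```python
def expected_fraction(p_dropout: float, years: int) -> float:
    """Expected fraction of lifetime donations actually given,
    relative to never dropping out."""
    survive = 1.0  # probability of still giving at the start of each year
    total = 0.0    # expected donation-years accumulated
    for _ in range(years):
        total += survive          # donate this year only if still giving
        survive *= 1 - p_dropout  # chance of making it to next year
    return total / years

print(expected_fraction(0.01, 37))  # ~0.84, i.e. the ~16% reduction above
print(expected_fraction(0.50, 37))  # ~0.05 at a 50% p.a. dropout risk
```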
Moreover, I rather doubt that the probability of turning selfish and giving up on Effective Altruism is anywhere near 50% in a given year. If it were that high, I think we’d have more evidence of it, despite the usual selection worry that people who are no longer interested stop responding.
Yep, that seems right. Certainly at the 10% donation level, it should be a lot lower than 50% (I hope!). I was thinking of 50% p.a. as the probability of giving up after ramping up to donating 90% of income each year, at least in my own circumstances (living on a pretty modest grad student stipend).
Also, there’s a little bit of relevant data on this in this post. Among the 38 people surveyed there, the dropout rate was >50% over 5 years (roughly a 13% annual risk, if the risk were constant and independent). So it’s pretty high, at least. But it’s not clear how much of that was due to people finding it too demanding and getting demotivated, rather than value drift.
Also, this doesn’t break your point, but I think percentages are the wrong way to think about this. In reality, donations should depend much more on local cost of living than on your personal salary: if COL is $40k and you make $50k, then donate up to $10k; if COL is $40k and you make $200k, then donate up to $160k.
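As a quick sketch of that rule (the figures and the helper name are just illustrative):

```python
def max_donation(income: float, cost_of_living: float) -> float:
    """Give up to (income - cost of living), rather than a fixed
    percentage of income. The max() guards against a negative
    donation when income falls below the cost of living."""
    return max(income - cost_of_living, 0.0)

print(max_donation(50_000, 40_000))   # $10k  (20% of income)
print(max_donation(200_000, 40_000))  # $160k (80% of income)
```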
Yes, good point! I’d agree that that’s a better way to look at it, especially for making broad generalisations across different people.