If we are taking the assumed donor behavior as given, and if the sole objective is maximizing donations to charity, this makes sense. But there is an available option that would be better for both the EA who is earning to give and the charity. The E2Ger could take the $100k job and donate 32%. With even slightly diminishing marginal utility of consumption, the E2Ger would be better off consuming $68k with certainty than having an 80% chance of consuming $45k, a 10% chance of consuming $50k, and a 10% chance of consuming $275k. And the charity would get slightly more in expectation ($32k rather than $31.5k).
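To make the comparison concrete, here is a minimal sketch of the arithmetic, using the figures above; log utility is just an assumed stand-in for a concave (diminishing-marginal-utility) utility function, not a claim about the right utility curve:

```python
import math

# Option A: take the $100k job and donate 32%.
safe_consumption = 68_000
safe_donation = 32_000

# Option B: the risky path with the donor behavior assumed in the post.
# (probability, consumption) outcomes; charity gets $31.5k in expectation.
risky_outcomes = [(0.80, 45_000), (0.10, 50_000), (0.10, 275_000)]
risky_expected_donation = 31_500

risky_expected_consumption = sum(p * c for p, c in risky_outcomes)

# Log utility as one example of a concave utility function (an assumption).
safe_utility = math.log(safe_consumption)
risky_expected_utility = sum(p * math.log(c) for p, c in risky_outcomes)

print(f"Expected consumption: safe ${safe_consumption:,.0f} vs risky ${risky_expected_consumption:,.0f}")
print(f"Expected donation:    safe ${safe_donation:,.0f} vs risky ${risky_expected_donation:,.0f}")
print(f"Expected log-utility: safe {safe_utility:.3f} vs risky {risky_expected_utility:.3f}")
# Under log utility the certain $68k beats the risky consumption lottery,
# while the charity's expected donation is slightly higher ($32k vs $31.5k).
```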
In practice, I think there is usually a tradeoff between risk and expected value when choosing among E2G jobs/careers, so choosing riskier options and donating a higher percentage when outcomes are favorable will tend to be the right policy. I’m just not sure that the main argument presented here strengthens the case for doing so.
This is something I’m thinking about for my personal situation, and I strongly agree with this comment (but don’t have a lot of actual data to back this view).
Considering the subset of people who donate everything after rent and food, this model might predict lower total donations for the higher-variance distribution (I expect rent and food costs to increase once you have a more intense, higher-risk job, because of opportunity costs).
But I think that in that case it’s still very likely that choosing riskier options will have a much higher expected impact.
I think I would model income with a roughly lognormal distribution with higher variance and higher EV than the one in the post (I would expect many to get way less than $50k, <1% to end up in debt, and a few to get much more than $500k, but I would love to see data on this; a rough sketch of what I mean is below).
[Edit: thinking about this more, it’s not at all obvious to me that the EV would be higher in the riskier approach, rather than lower, for the average person who’s undecided]
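To illustrate the kind of distribution I have in mind, here is a toy sketch; the median and spread are made-up parameters, not estimates, and a pure lognormal cannot capture the small chance of ending up in debt:

```python
import numpy as np

rng = np.random.default_rng(0)

median = 80_000  # assumed median outcome of the riskier path (illustrative only)
sigma = 1.0      # log-scale spread; higher sigma = fatter right tail

incomes = rng.lognormal(mean=np.log(median), sigma=sigma, size=100_000)

print(f"mean:   ${incomes.mean():,.0f}")
print(f"median: ${np.median(incomes):,.0f}")
print(f"share below $50k:  {np.mean(incomes < 50_000):.1%}")
print(f"share above $500k: {np.mean(incomes > 500_000):.1%}")
# A lognormal never goes negative, so the small probability of ending up
# in debt would have to be added separately (e.g. a few percent of mass
# at negative values).
```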
Since most EAs are younger than ~45, the biggest advantage could be in terms of career capital, as mentioned in the post.
I expect riskier work tends to require more effort, lead to more upskilling, and move one’s comparative advantage away from, e.g., “optimize the company’s obscure proprietary database” toward things that are more generally useful for direct work.
So I think the post actually undersells the main advantages of being risk-seeking in earning to give.
I think it also undersells the personal costs. Asking people who give $10k of a $100k salary to work harder, with a 90% chance of losing income (possibly significantly), all for a chance of becoming a millionaire, seems like a big ask.
So I wonder:
Would most people prefer to give more than 10% given the choice, like you mentioned?
Is the expected value of riskier options actually higher in practice?
How could we quantify the difference in career capital?