Here’s an argument for an EA billionaire advantage.
Suppose you have a $500 million fortune and an opportunity comes along to take a gamble: a 1% chance of turning it into $60 billion and a 99% chance of ending up with nothing. In pure dollar terms the gamble is positive expected value, since 1% × $60 billion = $600 million > $500 million. But I think most people would reject that bet due to risk aversion and the declining marginal utility of money.
To a highly motivated EA, though, it looks like a better deal, so they're more likely to go for it.
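Here's a minimal sketch of that comparison. The log utility function (as a stand-in for declining marginal utility) and the $100k fallback if the gamble busts are my assumptions, just to keep the numbers concrete:

```python
import math

# Rough sketch of the gamble above. The $100k fallback wealth is an
# assumption, added only to keep log-utility finite if the gamble fails.
current = 500e6          # $500 million fortune
win, p_win = 60e9, 0.01  # 1% shot at $60 billion
fallback = 100e3         # assumed residual wealth if the gamble busts

# In raw dollars the gamble is positive expected value: ~$600M > $500M.
ev = p_win * win + (1 - p_win) * fallback
print(f"expected dollars: ${ev/1e9:.2f}B gamble vs ${current/1e9:.2f}B sure thing")

# Someone with declining marginal utility of money (log utility here)
# still rejects the gamble by a wide margin...
u_keep = math.log(current)
u_gamble = p_win * math.log(win) + (1 - p_win) * math.log(fallback)
print(f"log utility:    {u_gamble:.1f} gamble vs {u_keep:.1f} sure thing")

# ...while a donor whose impact is roughly linear in dollars donated
# sees the gamble as the clearly better bet.
print(f"linear utility: ${ev/1e9:.2f}B gamble vs ${current/1e9:.2f}B sure thing")
```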
Ah yes, for what it’s worth, I do allude to this (as does Patel, who I’m paraphrasing): “Effective altruists are more risk-tolerant by default, since you don’t get diminishing returns on larger donations the same way you do on increased personal consumption.”
I feel like this should be accounted for in the EA base rate, but maybe the effect has gotten, or will get, more pronounced now that Sam Bankman-Fried is vocal about having this mindset.