Here’s an argument for an EA billionaire advantage.
Suppose you have a $500 million fortune and an opportunity presents itself: a gamble with a 1% chance of turning that into $60 billion and a 99% chance of ending up with nothing. In raw dollars this is positive expected value, since 1% × $60 billion = $600 million > $500 million. But I think most people would reject the bet due to risk aversion and the declining marginal utility of money.
To a highly motivated EA, though, it looks like a better deal, so you’re more likely to go for it.
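For concreteness, here’s a minimal sketch of the bet under both scorings. Log utility is a standard stand-in for declining marginal utility, and the $10k residual-wealth floor is my assumption (log utility is undefined at exactly $0):

```python
import math

WEALTH = 500e6   # current fortune
JACKPOT = 60e9   # upside if the gamble pays off
P_WIN = 0.01
FLOOR = 1e4      # assumed residual wealth after "ending up with nothing"

# Scored in raw dollars (linear utility), the gamble wins:
ev = P_WIN * JACKPOT + (1 - P_WIN) * FLOOR
print(f"dollar EV: gamble ${ev:,.0f} vs keep ${WEALTH:,.0f}")
# gamble EV ~= $600M > $500M, so a donor whose utility is roughly
# linear in dollars donated should take it.

# Scored with log utility (declining marginal utility), it loses badly:
eu_gamble = P_WIN * math.log(JACKPOT) + (1 - P_WIN) * math.log(FLOOR)
eu_keep = math.log(WEALTH)
print(f"log utility: gamble {eu_gamble:.2f} vs keep {eu_keep:.2f}")
# gamble ~= 9.4 vs keep ~= 20.0, so an ordinary consumer should refuse.
```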
There’s actually a surplus of high-risk-high-reward people in the world, to the point where people would sacrifice the $500 million for a 1% chance of getting $40 billion. They’re not just paying a premium for the possibility of becoming a billionaire and lording over everyone else; they’re also paying a further fee to compete against everyone else vying for that slot, because of the sheer number of people who psychologically want to become a billionaire and lord over everyone else.
In other words, it becomes a lottery.
Even worse than a lottery, in fact, because the real world has information asymmetry and is rigged to scam you in more complicated ways than lotteries are, through things like data poisoning and perimeterless security.
Upvoted because I don’t think this tension is discussed enough, even if only to refute it.
It strikes me that the median non-EA is more risk averse than EAs should be, so in moving from non-EA to EA you should probably drop some of your risk aversion. But it does also seem true that the top-performing people in your field might disproportionately be people who took negative-EV bets and got lucky, so we don’t necessarily want to be less risk averse than them.
^This is really important and I completely missed it. It’s similar to how the winner of an auction tends to be the type of person who mistakenly pays more than the item was worth to them (or to anyone). The most visible EAs (billionaires) could be the winners of a game with a massive net loss overall. Crypto is exactly that kind of thing.
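A quick simulation makes the selection effect vivid; every number here is made up purely for illustration:

```python
import random

random.seed(0)

# Illustrative population: half keep a safe $1M, half take a
# negative-EV gamble (0.5% chance of $100M, EV = $500k < $1M).
N_EACH = 50_000
people = [("safe", 1_000_000) for _ in range(N_EACH)]
people += [("gambler", 100_000_000 if random.random() < 0.005 else 0)
           for _ in range(N_EACH)]

# Who ends up at the top of the wealth leaderboard?
top_100 = sorted(people, key=lambda p: -p[1])[:100]
n_gamblers = sum(kind == "gambler" for kind, _ in top_100)
print(f"gamblers among the 100 richest: {n_gamblers}/100")
# Every gambler made a mistake in expectation, yet with ~250 expected
# winners the visible "top performers" are almost surely all gamblers.
```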
Ah yes, for what it’s worth, I do allude to this (as does Patel, who I’m paraphrasing): “Effective altruists are more risk-tolerant by default, since you don’t get diminishing returns on larger donations the same way you do on increased personal consumption.”
I feel like this should already be accounted for in the EA base rate, but maybe the effect has gotten, or will get, more pronounced now that Sam Bankman-Fried is vocal about having this mindset.