I really don’t understand why Greg agreed to this, or why anyone would agree. It is a sure-loss situation. If he is wrong and we (or most of us) are alive and kicking in 4 years, he has to pay. If he’s right, he’s dead. Could anyone explain to me why on Earth anyone would agree to this bet? Thanks.
He’s using the money to try to prevent AI doom (or at least delay it). If the money turns out to be decisive in averting AI doom, that would be worth far more to him than having to pay it back 2x + inflation.
Yes. Another consideration is that I expect my high-risk investing strategy to return >200% over the period in question, in worlds where we survive (all it would take is 1 in 10 of my start-up investments blowing up, for example, or crypto going up another >2x).
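For a rough sense of the arithmetic behind that claim (the equal-weight 10-company portfolio and the 30x multiple are assumed numbers for illustration, not figures Greg gives):

$$\frac{1}{10} \times 30\mathrm{x} + \frac{9}{10} \times 0\mathrm{x} = 3\mathrm{x},$$

i.e., one start-up in ten returning more than 30x, even with the other nine going to zero, is already enough for a >200 % return on the whole portfolio.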
Thanks for asking!
Greg thinks a given amount of money can do more good in 2024 than at the end of 2027.
In addition, Greg does not expect to lose much money in expectation: he receives 10 k$ now, and a 50 % chance of having to pay back 20 k$ is worth −10 k$ in expectation, for a net of 0[1].
In reality, Greg’s median time of human extinction is a little after January 2028, so his probability of surviving to the end of 2027 (and thus having to pay) is a bit above 50 %, and he expects to lose a little money.
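As a worked version of that calculation (writing $p$ for Greg’s probability that humanity survives to the end of 2027; the symbol $p$ is my notation, not from the comment):

$$\mathbb{E}[\text{payoff}] = 10\,\mathrm{k\$} - p \times 20\,\mathrm{k\$}, \qquad p = 0.5 \;\Rightarrow\; \mathbb{E}[\text{payoff}] = 0, \qquad p > 0.5 \;\Rightarrow\; \mathbb{E}[\text{payoff}] < 0.$$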