I think it’s pretty miscalibrated to assign a 10^-5 (or 1 in 100,000) chance that we’re in the Time of Perils.
Would you be interested in making a $2000:$1 bet that you will change your mind in the next 10 years and think that the chance we’re in the Time of Perils is >50%? (I’m also happy to bet larger numbers at that ratio).
I think this is a pretty good deal for you, if I did the math correctly:
your fair rate is >=50,000:1 for the proposition being false: if you assign 10^-5 now, conservation of expected evidence caps the chance that you’ll ever update to >50% at roughly 2×10^-5. So I’m offering you a 25x discount.
the proposition could be correct, but you still win the bet if you don’t update all the way up to >50% within the 10 years.
you get to resolve this bet according to your own beliefs, so there’s some leeway.
I might forget about this bet and probably won’t chase you for the money.
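The arithmetic behind the “25x discount” claim can be sketched as follows (my reading of the argument; the 10^-5 credence and the 2000:1 offer are from the thread, the conservation-of-expected-evidence cap is the implied step):

```python
# Sketch of the odds arithmetic behind the offered bet (my reading).
# Assumption from the thread: David currently assigns p = 1e-5 to the
# Time of Perils hypothesis.
p_now = 1e-5

# Conservation of expected evidence: if David expected to later assign
# > 0.5, his current credence would already have to reflect that, so
# P(he updates to > 50%) <= p_now / 0.5.
p_change_mind_max = p_now / 0.5  # 2e-5

# Fair odds against him changing his mind are therefore at least:
fair_odds = (1 - p_change_mind_max) / p_change_mind_max  # ~49,999:1

# The offer is 2000:1, so the discount relative to ~50,000:1 fair odds:
discount = 50_000 / 2_000  # 25x

print(f"fair odds >= {fair_odds:,.0f}:1, discount = {discount:.0f}x")
```

This is why the offer is framed as favorable: David is being paid 2000:1 on an event he should price at 50,000:1 or longer.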
- How can you judge calibration vs. miscalibration on a question like this?
- David changing his mind doesn’t seem like a good proxy, because in this context a change of mind might be better explained by cultural factors than by his prior being miscalibrated.
Sure, the more idealized bet would be to commit whatever the equivalent of “my estate” is in 2223 to pay him $1, and for David’s estate to pay my estate back $2000 in inflation-adjusted dollars 1,000,200 years from now.
But this seems hard to pull off logistically. 10 years is already a long time.
David changing his mind doesn’t seem like a good proxy, because in this context a change of mind might be better explained by cultural factors
I don’t know, man, it sure feels like at some level “the progenitor of a theory disavows it after some deliberation” should be one of the stronger pieces of evidence we have that a theory is false, in worlds where empirical evidence is very hard to get quickly.
I like the category of bets that says: “You believe X; I predict you will change your mind, let’s bet on it”.
I think this does promote good epistemics.