I happen to agree with Eliezer that careful thought shows MWI to be unambiguously correct, and given that, the more extreme his confidence in this (IMO correct) claim, the more credit he deserves.
‘The more probability someone assigns to a claim, the more credit they get when the claim turns out to be true’ is true as a matter of Bayesian math. And I agree with you that MWI is true, and that we have enough evidence to say it’s true with very high confidence, if by ‘MWI’ we just mean a conjunction like “Objective collapse is false.” and “Quantum non-realism is false / the entire complex amplitude is in some important sense real”.
(I think Eliezer had a conjunction like this in mind when he talked about ‘MWI’ in the Sequences; he wasn’t claiming that decoherence explains the Born rule, and he certainly wasn’t claiming that we need to reify ‘worlds’ as a fundamental thing. I think a better term for MWI might be the ‘Much World Interpretation’, since the basic point is about how much stuff there is, not about a division of that stuff into discrete ‘worlds’.)
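(As an aside, the 'more probability, more credit' point can be made concrete with a proper scoring rule. Here's a minimal sketch using the logarithmic score — purely my own illustration of the Bayesian-math claim, not anything from the original exchange:

```python
import math

def log_score(p_assigned, claim_true):
    """Log score in bits: the log of the probability you assigned to what actually happened."""
    p = p_assigned if claim_true else 1.0 - p_assigned
    return math.log2(p)

# Credit when the claim turns out true, at increasing confidence:
for p in (0.5, 0.9, 0.99, 0.999):
    print(p, round(log_score(p, True), 4))
# -> -1.0, -0.152, -0.0145, -0.0014  (more probability on the truth = strictly better score)

# The penalty when the claim turns out false grows much faster:
for p in (0.9, 0.99, 0.999):
    print(p, round(log_score(p, False), 2))
# -> -3.32, -6.64, -9.97
```

Assigning more probability to a claim that turns out true always scores better; the catch is what happens when you're wrong, which is where the 'docking points' part comes in below.)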
That said, I have no objection in principle to someone saying ‘Eliezer was right about MWI (and gets more points insofar as he was correct), but I also dock him more points than he gained because I think he was massively overconfident’.
E.g., imagine someone who assigns probability 1 (or probability .999999999) to a coin flip coming up heads. If the coin then comes up heads, I’m going to either assume they were trolling me or infer that they’re very bad at reasoning. Even if they somehow rigged the coin, .999999999 is just too extreme a probability to be justified here.
By the same logic, if Eliezer had said that MWI is true with probability 1, or if he’d put too many ‘9s’ at the end of his .99… probability assignment, then I’d probably dock him more points than he gained for being object-level-correct. (Or I’d at least assume he has a terrible understanding of how Bayesian probability works. Someone could indeed be very miscalibrated and bad at talking in probabilistic terms, and yet be very knowledgeable and correct on object-level questions like MWI.)
I’m not sure exactly how many 9s is too many in the case of MWI, but it’s obviously possible to have too many 9s here. E.g., a hundred 9s would be too many! So I think this objection can make sense; I just don’t think Eliezer is in fact overconfident about MWI.
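(To put a rough number on why a hundred 9s is too many: under the same log score as above, the hundredth 9 buys almost no extra credit when you're right, but stakes an enormous amount if you're wrong. Back-of-the-envelope, my own numbers:

```python
import math

# Upgrading from three 9s (0.999) to a hundred 9s (1 - 1e-100), under the log score:
# the extra credit if the claim is true is tiny...
extra_credit_if_right = 0.0 - math.log2(0.999)                  # ~0.0014 bits (log2(1 - 1e-100) is ~0)
# ...while the extra penalty if the claim is false is huge:
extra_penalty_if_wrong = math.log2(0.001) - math.log2(1e-100)   # ~322 bits

print(extra_credit_if_right, extra_penalty_if_wrong)
```

So the extra 9s buy essentially nothing when you're right and stake hundreds of bits against the possibility that you've made some subtle error somewhere, which is the sense in which that level of confidence is too extreme almost regardless of the object-level evidence.)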
Fair enough, thanks.