I agree that: Yudkowsky has an impressive understanding of physics for a layman; in some situations his understanding is on par with or exceeds that of some experts; and he has written explanations of technical topics that even some experts like and find impressive. This includes not just you but also e.g. Scott Aaronson, who praised his series on QM in the same answer I excerpted above, calling it entertaining and enjoyable and saying it got the technical stuff mostly right. He also praised its conceptual goals. I don’t believe this is faint praise, especially given stereotypes of amateurs writing about physics. This is a positive part of Yudkowsky’s track record. I think my comment sounds more negative about Yudkowsky’s QM sequence than it deserves, so thanks for pushing back on that.
I’m not sure what you mean when you call yourself a pro-MWI extremist, but in any case, AFAIK there are physicists, including one or more prominent ones, who think MWI is really the only explanation that makes sense, although there are obviously degrees in how fervently one can hold this position, and Yudkowsky seems at the extreme end of the scale in some of his writings. And he is far from the only one who thinks Copenhagen is ridiculous. These two parts of Yudkowsky’s position on MWI are not without parallel among professional physicists, and the point about Copenhagen being ridiculous is probably a point in his favor from most views (e.g. Nobel laureate Murray Gell-Mann said that Niels Bohr brainwashed people into Copenhagen), let alone this community. Perhaps I should have clarified this in my comment, although I did say that MWI is a leading interpretation and may well be correct.
The negative points I made in my comment were:
1. Yudkowsky’s confidence in MWI is disproportionate.
2. Yudkowsky’s conviction that people who disagree with him are making elementary mistakes is disproportionate.
3. These may come partly from a lack of knowledge or expertise.
Maybe (3) is a little unfair, or sounds harsher than I meant it. It’s a bit unclear to me how seriously to take Aaronson’s quote. It seems like plenty of physicists have looked through the sequences to find glaring flaws, and basically found none (physics stackexchange). This is a nontrivial achievement in context. At the same time I expect most of the scrutiny has been to a relatively shallow level, partly because Yudkowsky is a polarizing writer. Aaronson is probably one of fairly few people who have deep technical expertise and have read the sequences with both enjoyment and a critical eye. Aaronson suggested a specific, technical flaw that may be partly responsible for Yudkowsky holding an extreme position with overconfidence and misunderstanding what people who disagree with him think. Probably this is a flaw Yudkowsky would not have made if he had worked with a professional physicist or something. But maybe Aaronson was just casually speculating and maybe this doesn’t matter too much. I don’t know. Possibly you are right to push back on the mixed states explanation.
I think (1) and (2) are well worth considering though. The argument here is not that his position is necessarily wrong or impossible, but that it is overconfident. I am not courageous enough to argue for this position to a physicist who holds some kind of extreme pro-MWI view, but I think this is a reasonable view and there’s a good chance (1) and (2) are correct. It also fits in Ben’s point 4 in the comment above: “Yudkowsky’s track record suggests a substantial bias toward dramatic and overconfident predictions.”
Suppose that the majority of eminent mathematicians believe 5+5=10, but a significant minority believes 5+5=11. Also, out of the people in the 5+5=10 camp, some say “5+5=10 and anyone who says otherwise is just totally wrong”, whereas others say “I happen to believe that the balance of evidence is that 5+5=10, but my esteemed colleagues are reasonable people and have come to a different conclusion, so we 5+5=10 advocates should approach the issue with appropriate humility, not overconfidence.”
In this case, the fact of the matter is that 5+5=10. So in terms of who gets the most credit added to their track-record, the ranking is:
1st place: The ones who say “5+5=10 and anyone who says otherwise is just totally wrong”,
2nd place: The ones who say “I think 5+5=10, but one should be humble, not overconfident”,
3rd place: The ones who say “I think 5+5=11, but one should be humble, not overconfident”,
Last place: The ones who say “5+5=11 and anyone who says otherwise is just totally wrong.”
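This ranking is exactly what a proper scoring rule such as the logarithmic score produces. Here is a quick sketch; the probabilities are made up purely for illustration, not anyone’s actual credences:

```python
import math

# Log score: credit = ln(probability assigned to the true outcome).
# Closer to zero is better. The truth (in the story) is that 5+5=10.
# The probabilities below are invented for illustration only.
camps = [
    ("confident and right (P(5+5=10) = 0.99)", 0.99),
    ("humble and right    (P(5+5=10) = 0.70)", 0.70),
    ("humble and wrong    (P(5+5=10) = 0.30)", 0.30),
    ("confident and wrong (P(5+5=10) = 0.01)", 0.01),
]

for label, p in camps:
    print(f"{label}: log score = {math.log(p):.2f}")
```

The scores come out in the same order as the 1st-to-last ranking above (roughly −0.01, −0.36, −1.20, −4.61): confidence is rewarded when you’re right and punished hardest when you’re wrong.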
Back to the issue here. Yudkowsky is claiming “MWI, and anyone who says otherwise is just totally wrong”. (And I agree—that’s what I meant when I called myself a pro-MWI extremist.)
IF the fact of the matter is that careful thought shows MWI to be unambiguously correct, then Yudkowsky (and I) get more credit for being more confident. Basically, he’s going all in and betting his reputation on MWI being right, and (in this scenario) he won the bet.
Conversely, IF the fact of the matter is that careful thought shows MWI to be not unambiguously correct, then Eliezer loses the maximum number of points. He staked his reputation on MWI being right, and (in this scenario) he lost the bet.
So that’s my model, and in my model “overconfidence” per se is not really a thing in this context. Instead we first have to take a stand on the object-level controversy. I happen to agree with Eliezer that careful thought shows MWI to be unambiguously correct, and given that, the more extreme his confidence in this (IMO correct) claim, the more credit he deserves.
I’m trying to make sense of why you’re bringing up “overconfidence” here. The only thing I can think of is that you think that maybe there is simply not enough information to figure out whether MWI is right or wrong (not even for an ideal reasoner with a brain the size of Jupiter and a billion years to ponder the topic), and therefore saying “MWI is unambiguously correct” is “overconfident”? If that’s what you’re thinking, then my reply is: if “not enough information” were the actual fact of the matter about MWI, then we should criticize Yudkowsky first and foremost for being wrong, not for being overconfident.
As for your point (2), I forget what mistakes Yudkowsky claimed that anti-MWI-advocates are making, and in particular whether he thought those mistakes were “elementary”. I am open-minded to the possibility that Yudkowsky was straw-manning the MWI critics, and that they are wrong for more interesting and subtle reasons than he gives them credit for, and in particular that he wouldn’t pass an anti-MWI ITT. (For my part, I’ve tried harder, see e.g. here.) But that’s a different topic. FWIW I don’t think of Yudkowsky as having a strong ability to explain people’s wrong opinions in a sympathetic and ITT-passing way, or if he does have that ability, then I find that he chooses not to exercise it too much in his writings. :-P
I happen to agree with Eliezer that careful thought shows MWI to be unambiguously correct, and given that, the more extreme his confidence in this (IMO correct) claim, the more credit he deserves.
‘The more probability someone assigns to a claim, the more credit they get when the claim turns out to be true’ is true as a matter of Bayesian math. And I agree with you that MWI is true, and that we have enough evidence to say it’s true with very high confidence, if by ‘MWI’ we just mean a conjunction like “Objective collapse is false.” and “Quantum non-realism is false / the entire complex amplitude is in some important sense real”.
(I think Eliezer had a conjunction like this in mind when he talked about ‘MWI’ in the Sequences; he wasn’t claiming that decoherence explains the Born rule, and he certainly wasn’t claiming that we need to reify ‘worlds’ as a fundamental thing. I think a better term for MWI might be the ‘Much World Interpretation’, since the basic point is about how much stuff there is, not about a division of that stuff into discrete ‘worlds’.)
That said, I have no objection in principle to someone saying ‘Eliezer was right about MWI (and gets more points insofar as he was correct), but I also dock him more points than he gained because I think he was massively overconfident’.
E.g., imagine someone who assigns probability 1 (or probability .999999999) to a coin flip coming up heads. If the coin then comes up heads, then I’m going to either assume they were trolling me, or I’m going to infer that they’re very bad at reasoning. Even if they somehow rigged the coin, .999999999 is just too extreme a probability to be justified here.
By the same logic, if Eliezer had said that MWI is true with probability 1, or if he’d put too many ‘9s’ at the end of his .99… probability assignment, then I’d probably dock him more points than he gained for being object-level-correct. (Or I’d at least assume he has a terrible understanding of how Bayesian probability works. Someone could indeed be very miscalibrated and bad at talking in probabilistic terms, and yet be very knowledgeable and correct on object-level questions like MWI.)
I’m not sure exactly how many 9s is too many in the case of MWI, but it’s obviously possible to have too many 9s here. E.g., a hundred 9s would be too many! So I think this objection can make sense; I just don’t think Eliezer is in fact overconfident about MWI.
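The reason too many 9s gets punished falls out of the same scoring-rule math: your expected score is maximized by reporting the probability the evidence actually supports, and overshooting costs you. A sketch, with a made-up “justified” probability chosen only for illustration:

```python
import math

def expected_log_score(reported: float, true_p: float) -> float:
    """Expected log score if the event really happens with probability
    true_p and you report `reported` as your probability for it."""
    return true_p * math.log(reported) + (1 - true_p) * math.log(1 - reported)

# Suppose (purely for illustration) the evidence only justifies 0.99.
TRUE_P = 0.99
for reported in (0.9, 0.99, 0.999999999):
    print(f"report {reported}: expected score = {expected_log_score(reported, TRUE_P):.4f}")
```

On these numbers, reporting nine 9s scores worse in expectation than reporting a mere 0.9, because the rare miss costs ln(10⁻⁹) ≈ −20.7. That’s the sense in which extra 9s beyond what the evidence supports lose more points than they gain.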
I’m trying to make sense of why you’re bringing up “overconfidence” here. The only thing I can think of is that you think that maybe there is simply not enough information to figure out whether MWI is right or wrong (not even for an ideal reasoner with a brain the size of Jupiter and a billion years to ponder the topic), and therefore saying “MWI is unambiguously correct” is “overconfident”?
Here’s my point: There is a rational limit to the amount of confidence one can have in MWI (or any belief). I don’t know where exactly this limit is for MWI-extremism but Yudkowsky clearly exceeded it sometimes. To use made up numbers, suppose:
MWI is objectively correct
Eliezer says P(MWI is correct) = 0.9999999
But rationally one can only reach P(MWI) = 0.999
Because there are remaining uncertainties that cannot be eliminated through superior thinking and careful consideration, such as lack of experimental evidence, the possibility of QM getting overturned, the possibility of a new and better interpretation in the future, and unknown unknowns.
These factors add up to at least P(Not MWI) = 0.001.
Then even though Eliezer is correct about MWI being correct, he is still significantly overconfident in his belief about it.
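One way to see how large the gap between those made-up numbers is: convert both probabilities to odds and compare them in bits of evidence (both figures below come from the illustrative numbers above, not from anything Eliezer actually stated):

```python
import math

claimed = 0.9999999   # made-up number from the example above
justified = 0.999     # made-up rational ceiling from the example above

odds = lambda p: p / (1 - p)

# How many bits of evidence does the claimed credence assert
# beyond what the justified credence asserts?
excess_bits = math.log2(odds(claimed) / odds(justified))
print(f"claimed odds ~ {odds(claimed):,.0f}:1, justified odds ~ {odds(justified):,.0f}:1")
print(f"excess confidence ~ {excess_bits:.1f} bits")
```

On these numbers the claim overshoots by about 13 bits, i.e. roughly 10,000× in odds, which is why the overconfidence would be significant even though the object-level conclusion is right.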
Consider Paul’s example of Eliezer saying MWI is comparable to heliocentrism:
If we are deeply wrong about physics, then I [Paul Christiano] think this could go either way. And it still seems quite plausible that we are deeply wrong about physics in one way or another (even if not in any particular way). So I think it’s wrong to compare many-worlds to heliocentrism (as Eliezer has done). Heliocentrism is extraordinarily likely even if we are completely wrong about physics—direct observation of the solar system really is a much stronger form of evidence than a priori reasoning about the existence of other worlds.
I agree with Paul here. Heliocentrism is vastly more likely than any particular interpretation of quantum mechanics, and Eliezer was wrong to have made this comparison.
This may sound like I’m nitpicking, but I think it fits into a pattern of Eliezer making dramatic and overconfident pronouncements, and it’s relevant information for people to consider e.g. when evaluating Eliezer’s belief that p(doom) = ~1 and the AI safety situation is so hopeless that the only thing left is to die with slightly more dignity.
Of course, it’s far from the only relevant data point.
Regarding (2), I think we’re on the same page haha.
Could someone point to the actual quotes where Eliezer compares heliocentrism to MWI? I don’t generally assume that when people are ‘comparing’ two very-high-probability things, they’re saying they have the same probability. Among other things, I’d want confirmation that ‘Eliezer and Paul assign roughly the same probability to MWI, but they have different probability thresholds for comparing things to heliocentrism’ is false.
E.g., if I compare Flat Earther beliefs, beliefs in psychic powers, belief ‘AGI was secretly invented in the year 2000’, geocentrism, homeopathy, and theism to each other, it doesn’t follow that I’d assign the same probabilities to all of those six claims, or even probabilities that are within six orders of magnitude of each other.
In some contexts it might indeed Griceanly imply that all six of those things pass my threshold for ‘unlikely enough that I’m happy to call them all laughably silly views’, but different people have their threshold for that kind of thing in different places.
Gotcha, thanks. I guess we have an object-level disagreement: I think that careful thought reveals MWI to be unambiguously correct, with enough 9’s as to justify Eliezer’s tone. And you don’t. ¯\_(ツ)_/¯
(Of course, this is bound to be a judgment call; e.g. Eliezer didn’t state how many 9’s of confidence he has. It’s not like there’s a universal convention for how many 9’s are enough 9’s to state something as a fact without hedging, or how many 9’s are enough 9’s to mock the people who disagree with you.)
(Of course, this is bound to be a judgment call; e.g. Eliezer didn’t state how many 9’s of confidence he has. It’s not like there’s a universal convention for how many 9’s are enough 9’s to state something as a fact without hedging, or how many 9’s are enough 9’s to mock the people who disagree with you.)
Yes, agreed.
Let me lay out my thinking in more detail. I mean this to explain my views, not as an attempt to persuade.
Paul’s account of Aaronson’s view says that Eliezer shouldn’t be as confident in MWI as he is, which in words sounds exactly like my point, and similar to Aaronson’s stack exchange answer. But it still leaves open the question of how overconfident he was, and what, if anything, should be taken away from this. It’s possible that there’s a version of my point which is true but is also uninteresting or trivial (who cares if Yudkowsky was 10% too confident about MWI 15 years ago?).
And it’s worth reiterating that a lot of people give Eliezer credit for his writing on QM, including for being forceful in his views. I have no desire to argue against this. I had hoped to sidestep discussing this entirely since I consider it to be a separate point, but perhaps this was unfair and led to miscommunication. If someone wants to write a detailed comment/post explaining why Yudkowsky deserves a lot of credit for his QM writing, including credit for how forceful he was at times, I would be happy to read it and would likely upvote/strong upvote it depending on quality.
However, here my intention was to focus on the overconfidence aspect.
I’ll explain what I see as the epistemic mistakes Eliezer likely made to end up in an overconfident state. Why do I think Eliezer was overconfident on MWI?
(Some of the following may be wrong.)
He didn’t understand non-MWI-extremist views, which should have rationally limited his confidence
I don’t have sources for this, but I think something like this is true.
This was an avoidable mistake
Worth noting that, per Rob’s comment elsewhere in this thread, Eliezer has updated towards the competence of elites in science since some of his early writing.
It’s possible that his technical understanding was uneven. This should also have limited his confidence.
Aaronson praised him for “actually get most of the technical stuff right”, which of course implies that not everything technical was correct.
He also suggested a specific, technical flaw in Yudkowsky’s understanding.
One big problem with having extreme conclusions based on uneven technical understanding is that you don’t know what you don’t know. And in fact Aaronson suggests a mistake Yudkowsky seems unaware of as a reason why Yudkowsky’s central argument is overstated/why Yudkowsky is overconfident about MWI.
However, it’s unclear how true/important a point this really is
At least 4 points limit confidence in P(MWI) to some degree:
Lack of experimental evidence
The possibility of QM getting overturned
The possibility of a new and better interpretation in the future
Unknown unknowns
I believe most or all of these are valid, commonly brought up points that together limit how confident anyone can be in P(MWI). Reasonable people may disagree with their weighting of course.
I am skeptical that Eliezer correctly accounted for these factors
Note that these are all points about the epistemic position Eliezer was in, not about the correctness of MWI. The first two are particular to him, and the last one applies to everyone.
Now, Rob points out that maybe the heliocentrism example is lacking context in some way (I find it a very compelling example of a super overconfident mistake if it’s not). Personally I think there are at least a couple[1][2] of places in the sequences where Yudkowsky clearly says something that I think indicates ridiculous overconfidence tied to epistemic mistakes, but to be honest I’m not excited to argue about whether some of his language 15 years ago was or wasn’t overzealous.
The reason I brought this up despite it being a pretty minor point is because I think it’s part of a general pattern of Eliezer being overconfident in his views and overstating them. I am curious how much people actually disagree with this.
Of course, whether Eliezer has a tendency to be overconfident and overstate his views is only one small data point among very many others in evaluating p(doom), the value of listening to Eliezer’s views, etc.
“Many-worlds is an obvious fact, if you have all your marbles lined up correctly (understand very basic quantum physics, know the formal probability theory of Occam’s Razor, understand Special Relativity, etc.)”
For what it’s worth, consider the claim “The Judeo-Christian God, the one who listens to prayers and so on, doesn’t exist.” I have such high confidence in this claim that I would absolutely state it as a fact without hedging, and psychoanalyze people for how they came to disagree with me. Yet there’s a massive theology literature arguing to the contrary of that claim, including by some very smart and thoughtful people, and I’ve read essentially none of this theology literature, and if you asked me to do an anti-atheism ITT I would flunk it catastrophically.
I’m not sure what lesson you’ll take from that; for all I know you yourself are very religious, and this anecdote will convince you that I have terrible judgment. But if you happen to be on the same page as me, then maybe this would be an illustration of the fact that (I claim) one can rationally and correctly arrive at extremely-confident beliefs without it needing to pass through a deep understanding and engagement with the perspectives of the people who disagree with you.
I agree that this isn’t too important a conversation, it’s just kinda interesting. :)
I’m not sure either of the quotes you cited by Eliezer require or suggest ridiculous overconfidence.
If I’ve seen some photos of a tiger in town, and I know a bunch of people in town who got eaten by an animal, and we’ve all seen some apparent tiger-prints near where people got eaten, I may well say “it’s obvious there is a tiger in town eating people.” If people used to think it was a bear, but that belief was formed based on priors when we didn’t yet have any hard evidence about the tiger, I may be frustrated with people who haven’t yet updated. I may say “The only question is how quickly people’s views shift from bear to tiger. Those who haven’t already shifted seem like they are systematically slow on the draw and we should learn from their mistakes.” I don’t think any of those statements imply I think there’s a 99.9% chance that it’s a tiger. It’s more a statement rejecting the reasons why people think there is a bear, and disagreeing with those reasons, and expecting their views to predictably change over time. But I could say all that while still acknowledging some chance that the tiger is a hoax, that there is a new species of animal that’s kind of like a tiger, that the animal we saw in photos is different from the one that’s eating people, or whatever else. The exact smallness of the probability of “actually it wasn’t the tiger after all” is not central to my claim that it’s obvious or that people will come around.
I don’t think it’s central to this point, but I think 99% is a defensible estimate for many-worlds. I would probably go somewhat lower but certainly wouldn’t run victory laps about that or treat it as damning of someone’s character. The above is mostly a bad analogy explaining why I think it’s pretty reasonable to say things like Eliezer did even if your all-things-considered confidence was 99% or even lower.
To get a sense for what Eliezer finds frustrating and intends to critique, you can read If many-worlds had come first (which I find quite obnoxious). I think to the extent that he’s wrong it’s generally by mischaracterizing the alternative position and being obnoxious about it (e.g. misunderstanding the extent to which collapse is proposed as ontologically fundamental rather than an expression of agnosticism or a framework for talking about experiments, and by slightly misunderstanding what “ontologically fundamental collapse” would actually mean). I don’t think it has much to do with overconfidence directly, or speaks to the quality of Eliezer’s reasoning about the physical world, though I think it is a bad recurring theme in Eliezer’s reasoning about and relationships with other humans. And in fairness I do think there are a lot of people who probably deserve Eliezer’s frustration on this point (e.g. who talk about how collapse is an important and poorly-understood phenomenon rather than most likely just being the most boring thing) though I mostly haven’t talked with them and I think they are systematically more mediocre physicists.
“Maybe (3) is a little unfair, or sounds harsher than I meant it. It’s a bit unclear to me how seriously to take Aaronson’s quote. It seems like plenty of physicists have looked through the sequences to find glaring flaws, and basically found none (physics stackexchange).”
Here are a couple: he conflates Copenhagen and objective collapse throughout.
He fails to distinguish Everettian and decoherence-based MWI.
I agree that: Yudkowsky has an impressive understanding of physics for a layman, in some situations his understanding is on par with or exceeds some experts, and he has written explanations of technical topics that even some experts like and find impressive. This includes not just you, but also e.g. Scott Aaronson, who praised his series on QM in the same answer I excerpted above, calling it entertaining, enjoyable, and getting the technical stuff mostly right. He also praised it for its conceptual goals. I don’t believe this is faint praise, especially given stereotypes of amateurs writing about physics. This is a positive part of Yudkowsky’s track record. I think my comment sounds more negative about Yudkowsky’s QM sequence than it deserves, so thanks for pushing back on that.
I’m not sure what you mean when you call yourself a pro-MWI extremist but in any case AFAIK there are physicists, including one or more prominent ones, who think MWI is really the only explanation that makes sense, although there are obviously degrees in how fervently one can hold this position and Yudkowsky seems at the extreme end of the scale in some of his writings. And he is far from the only one who thinks Copenhagen is ridiculous. These two parts of Yudkowsky’s position on MWI are not without parallel within professional physicists, and the point about Copenhagen being ridiculous is probably a point in his favor from most views (e.g. Nobel laureate Murray Gell-Mann said that Neils Bohr brainwashed people into Copenhagen), let alone this community. Perhaps I should have clarified this in my comment, although I did say that MWI is a leading interpretation and may well be correct.
The negative aspects I said in my comment were:
Yudkowsky’s confidence in MWI is disproportionate
Yudkowsky’s conviction that people who disagree with him are making elementary mistakes is disproportionate
These may come partly from a lack of knowledge or expertise
Maybe (3) is a little unfair, or sounds harsher than I meant it. It’s a bit unclear to me how seriously to take Aaronson’s quote. It seems like plenty of physicists have looked through the sequences to find glaring flaws, and basically found none (physics stackexchange). This is a nontrivial achievement in context. At the same time I expect most of the scrutiny has been to a relatively shallow level, partly because Yudkowsky is a polarizing writer. Aaronson is probably one of fairly few people who have deep technical expertise and have read the sequences with both enjoyment and a critical eye. Aaronson suggested a specific, technical flaw that may be partly responsible for Yudkowsky holding an extreme position with overconfidence and misunderstanding what people who disagree with him think. Probably this is a flaw Yudkowsky would not have made if he had worked with a professional physicist or something. But maybe Aaronson was just casually speculating and maybe this doesn’t matter too much. I don’t know. Possibly you are right to push back on the mixed states explanation.
I think (1) and (2) are well worth considering though. The argument here is not that his position is necessarily wrong or impossible, but that it is overconfident. I am not courageous enough to argue for this position to a physicist who holds some kind of extreme pro-MWI view, but I think this is a reasonable view and there’s a good chance (1) and (2) are correct. It also fits in Ben’s point 4 in the comment above: “Yudkowsky’s track record suggests a substantial bias toward dramatic and overconfident predictions.”
Hmm, I’m a bit confused where you’re coming from.
Suppose that the majority of eminent mathematicians believe 5+5=10, but a significant minority believes 5+5=11. Also, out of the people in the 5+5=10 camp, some say “5+5=10 and anyone who says otherwise is just totally wrong”, whereas other people said “I happen to believe that the balance of evidence is that 5+5=10, but my esteemed colleagues are reasonable people and have come to a different conclusion, so we 5+5=10 advocates should approach the issue with appropriate humility, not overconfidence.”
In this case, the fact of the matter is that 5+5=10. So in terms of who gets the most credit added to their track-record, the ranking is:
1st place: The ones who say “5+5=10 and anyone who says otherwise is just totally wrong”,
2nd place: The ones who say “I think 5+5=10, but one should be humble, not overconfident”,
3rd place: The ones who say “I think 5+5=11, but one should be humble, not overconfident”,
Last place: The ones who say “5+5=11 and anyone who says otherwise is just totally wrong.
Agree so far?
(See also: Bayes’s theorem, Brier score, etc.)
Back to the issue here. Yudkowsky is claiming “MWI, and anyone who says otherwise is a just totally wrong”. (And I agree—that’s what I meant when I called myself a pro-MWI extremist.)
IF the fact of the matter is that careful thought shows MWI to be unambiguously correct, then Yudkowsky (and I) get more credit for being more confident. Basically, he’s going all in and betting his reputation on MWI being right, and (in this scenario) he won the bet.
Conversely, IF the fact of the matter is that careful thought shows MWI to be not unambiguously correct, then Eliezer loses the maximum number of points. He staked his reputation on MWI being right, and (in this scenario) he lost the bet.
So that’s my model, and in my model “overconfidence” per se is not really a thing in this context. Instead we first have to take a stand on the object-level controversy. I happen to agree with Eliezer that careful thought shows MWI to be unambiguously correct, and given that, the more extreme his confidence in this (IMO correct) claim, the more credit he deserves.
I’m trying to make sense of why you’re bringing up “overconfidence” here. The only thing I can think of is that you think that maybe there is simply not enough information to figure out whether MWI is right or wrong (not even for even an ideal reasoner with a brain the size of Jupiter and a billion years to ponder the topic), and therefore saying “MWI is unambiguously correct” is “overconfident”? If that’s what you’re thinking, then my reply is: if “not enough information” were the actual fact of the matter about MWI, then we should criticize Yudkowsky first and foremost for being wrong, not for being overconfident.
As for your point (2), I forget what mistakes Yudkowsky claimed that anti-MWI-advocates are making, and in particular whether he thought those mistakes were “elementary”. I am open-minded to the possibility that Yudkowsky was straw-manning the MWI critics, and that they are wrong for more interesting and subtle reasons than he gives them credit for, and in particular that he wouldn’t pass an anti-MWI ITT. (For my part, I’ve tried harder, see e.g. here.) But that’s a different topic. FWIW I don’t think of Yudkowsky as having a strong ability to explain people’s wrong opinions in a sympathetic and ITT-passing way, or if he does have that ability, then I find that he chooses not to exercise it too much in his writings. :-P
‘The more probability someone assigns to a claim, the more credit they get when the claim turns out to be true’ is true as a matter of Bayesian math. And I agree with you that MWI is true, and that we have enough evidence to say it’s true with very high confidence, if by ‘MWI’ we just mean a conjunction like “Objective collapse is false.” and “Quantum non-realism is false / the entire complex amplitude is in some important sense real”.
(I think Eliezer had a conjunction like this in mind when he talked about ‘MWI’ in the Sequences; he wasn’t claiming that decoherence explains the Born rule, and he certainly wasn’t claiming that we need to reify ‘worlds’ as a fundamental thing. I think a better term for MWI might be the ‘Much World Interpretation’, since the basic point is about how much stuff there is, not about a division of that stuff into discrete ‘worlds’.)
That said, I have no objection in principle to someone saying ‘Eliezer was right about MWI (and gets more points insofar as he was correct), but I also dock him more points than he gained because I think he was massively overconfident’.
E.g., imagine someone who assigns probability 1 (or probability .999999999) to a coin flip coming up heads. If the coin then comes up heads, then I’m going to either assume they were trolling me, or I’m going to infer that they’re very bad at reasoning. Even if they somehow rigged the coin, .999999999 is just too extreme a probability to be justified here.
By the same logic, if Eliezer had said that MWI is true with probability 1, or if he’d put too many ‘9s’ at the end of his .99… probability assignment, then I’d probably dock him more points than he gained for being object-level-correct. (Or I’d at least assume he has a terrible understanding of how Bayesian probability works. Someone could indeed be very miscalibrated and bad at talking in probabilistic terms, and yet be very knowledgeable and correct on object-level questions like MWI.)
I’m not sure exactly how many 9s is too many in the case of MWI, but it’s obviously possible to have too many 9s here. E.g., a hundred 9s would be too many! So I think this objection can make sense; I just don’t think Eliezer is in fact overconfident about MWI.
Fair enough, thanks.
Here’s my point: There is a rational limit to the amount of confidence one can have in MWI (or any belief). I don’t know where exactly this limit is for MWI-extremism but Yudkowsky clearly exceeded it sometimes. To use made up numbers, suppose:
MWI is objectively correct
Eliezer says P(MWI is correct) = 0.9999999
But rationally one can only reach P(MWI) = 0.999
Because there are remaining uncertainties that cannot be eliminated through superior thinking and careful consideration, such as a lack of experimental evidence, the possibility of QM being overturned, the possibility of a new and better interpretation emerging in the future, and unknown unknowns.
These factors add up to at least P(Not MWI) = 0.001.
Then even though Eliezer is correct about MWI being correct, he is still significantly overconfident in his belief about it.
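To make the gap in the made-up numbers above concrete, one can convert each probability into log-odds, i.e. the "bits of evidence" needed to reach it from even odds. This is just an illustration using the hypothetical figures from this comment, not a claim about Eliezer's actual credence:

```python
import math

def bits_of_evidence(p):
    """Log-odds in bits needed to reach probability p from a 50/50 prior."""
    return math.log2(p / (1 - p))

# The made-up numbers from the example above:
for p in (0.999, 0.9999999):
    print(f"P = {p}: ~{bits_of_evidence(p):.1f} bits")
```

The point of the arithmetic: P = 0.999 corresponds to roughly 10 bits of evidence, while P = 0.9999999 corresponds to roughly 23 bits, so the second claim asserts more than twice as much evidential support as the first, even though both look like "very confident" numbers at a glance.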
Consider Paul’s example of Eliezer saying MWI is comparable to heliocentrism:
I agree with Paul here. Heliocentrism is vastly more likely than any particular interpretation of quantum mechanics, and Eliezer was wrong to have made this comparison.
This may sound like I’m nitpicking, but I think it fits into a pattern of Eliezer making dramatic and overconfident pronouncements, and it’s relevant information for people to consider e.g. when evaluating Eliezer’s belief that p(doom) = ~1 and the AI safety situation is so hopeless that the only thing left is to die with slightly more dignity.
Of course, it’s far from the only relevant data point.
Regarding (2), I think we’re on the same page haha.
Could someone point to the actual quotes where Eliezer compares heliocentrism to MWI? I don’t generally assume that when people are ‘comparing’ two very-high-probability things, they’re saying they have the same probability. Among other things, I’d want confirmation that ‘Eliezer and Paul assign roughly the same probability to MWI, but they have different probability thresholds for comparing things to heliocentrism’ is false.
E.g., if I compare Flat Earther beliefs, beliefs in psychic powers, belief ‘AGI was secretly invented in the year 2000’, geocentrism, homeopathy, and theism to each other, it doesn’t follow that I’d assign the same probabilities to all of those six claims, or even probabilities that are within six orders of magnitude of each other.
In some contexts it might indeed Griceanly imply that all six of those things pass my threshold for ‘unlikely enough that I’m happy to call them all laughably silly views’, but different people have their threshold for that kind of thing in different places.
Gotcha, thanks. I guess we have an object-level disagreement: I think that careful thought reveals MWI to be unambiguously correct, with enough 9’s to justify Eliezer’s tone. And you don’t. ¯\_(ツ)_/¯
(Of course, this is bound to be a judgment call; e.g. Eliezer didn’t state how many 9’s of confidence he has. It’s not like there’s a universal convention for how many 9’s are enough 9’s to state something as a fact without hedging, or how many 9’s are enough 9’s to mock the people who disagree with you.)
Yes, agreed.
Let me lay out my thinking in more detail. I mean this as an explanation of my views, not as an attempt to persuade.
Paul’s account of Aaronson’s view says that Eliezer shouldn’t be as confident in MWI as he is, which in words sounds exactly like my point, and similar to Aaronson’s stack exchange answer. But it still leaves open the question of how overconfident he was, and what, if anything, should be taken away from this. It’s possible that there’s a version of my point which is true but is also uninteresting or trivial (who cares if Yudkowsky was 10% too confident about MWI 15 years ago?).
And it’s worth reiterating that a lot of people give Eliezer credit for his writing on QM, including for being forceful in his views. I have no desire to argue against this. I had hoped to sidestep discussing this entirely since I consider it to be a separate point, but perhaps this was unfair and led to miscommunication. If someone wants to write a detailed comment/post explaining why Yudkowsky deserves a lot of credit for his QM writing, including credit for how forceful he was at times, I would be happy to read it and would likely upvote/strong upvote it depending on quality.
However, here my intention was to focus on the overconfidence aspect.
I’ll explain what I see as the epistemic mistakes Eliezer likely made to end up in an overconfident state. Why do I think Eliezer was overconfident on MWI?
(Some of the following may be wrong.)
He didn’t understand non-MWI-extremist views, which should have rationally limited his confidence
I don’t have sources for this, but I think something like this is true.
This was an avoidable mistake
Worth noting that Eliezer has updated towards the competence of elites in science since some of his early writing, according to Rob’s comment elsewhere in this thread
It’s possible that his technical understanding was uneven. This should also have limited his confidence.
Aaronson praised him for “actually get[ting] most of the technical stuff right”, which of course implies that not everything technical was correct.
He also suggested a specific, technical flaw in Yudkowsky’s understanding.
One big problem with having extreme conclusions based on uneven technical understanding is that you don’t know what you don’t know. And in fact Aaronson suggests a mistake Yudkowsky seems unaware of as a reason why Yudkowsky’s central argument is overstated/why Yudkowsky is overconfident about MWI.
However, it’s unclear how true/important a point this really is
At least 4 points limit confidence in P(MWI) to some degree:
Lack of experimental evidence
The possibility of QM getting overturned
The possibility of a new and better interpretation in the future
Unknown unknowns
I believe most or all of these are valid, commonly brought up points that together limit how confident anyone can be in P(MWI). Reasonable people may disagree with their weighting of course.
I am skeptical that Eliezer correctly accounted for these factors
Note that these are all points about the epistemic position Eliezer was in, not about the correctness of MWI. The first two are particular to him, and the last one applies to everyone.
Now, Rob points out that maybe the heliocentrism example is lacking context in some way (if it’s not, I find it a very compelling example of a seriously overconfident mistake). Personally I think there are at least a couple[1] [2] of places in the sequences where Yudkowsky clearly says something that indicates ridiculous overconfidence tied to epistemic mistakes, but to be honest I’m not excited to argue about whether some of his language from 15 years ago was or wasn’t overzealous.
The reason I brought this up despite it being a pretty minor point is because I think it’s part of a general pattern of Eliezer being overconfident in his views and overstating them. I am curious how much people actually disagree with this.
Of course, whether Eliezer has a tendency to be overconfident and overstate his views is only one small data point among very many others in evaluating p(doom), the value of listening to Eliezer’s views, etc.
“Many-worlds is an obvious fact, if you have all your marbles lined up correctly (understand very basic quantum physics, know the formal probability theory of Occam’s Razor, understand Special Relativity, etc.)”
“The only question now is how long it will take for the people of this world to update.” Both quotes from https://www.lesswrong.com/s/Kqs6GR7F5xziuSyGZ/p/S8ysHqeRGuySPttrS
For what it’s worth, consider the claim “The Judeo-Christian God, the one who listens to prayers and so on, doesn’t exist.” I have such high confidence in this claim that I would absolutely state it as a fact without hedging, and psychoanalyze people for how they came to disagree with me. Yet there’s a massive theology literature arguing to the contrary of that claim, including by some very smart and thoughtful people, and I’ve read essentially none of this theology literature, and if you asked me to do an anti-atheism ITT I would flunk it catastrophically.
I’m not sure what lesson you’ll take from that; for all I know you yourself are very religious, and this anecdote will convince you that I have terrible judgment. But if you happen to be on the same page as me, then maybe this would be an illustration of the fact that (I claim) one can rationally and correctly arrive at extremely-confident beliefs without it needing to pass through a deep understanding and engagement with the perspectives of the people who disagree with you.
I agree that this isn’t too important a conversation, it’s just kinda interesting. :)
I’m not sure either of the quotes you cited by Eliezer require or suggest ridiculous overconfidence.
If I’ve seen some photos of a tiger in town, and I know a bunch of people in town who got eaten by an animal, and we’ve all seen some apparent tiger-prints near where people got eaten, I may well say “it’s obvious there is a tiger in town eating people.” If people used to think it was a bear, but that belief was formed based on priors when we didn’t yet have any hard evidence about the tiger, I may be frustrated with people who haven’t yet updated. I may say “The only question is how quickly people’s views shift from bear to tiger. Those who haven’t already shifted seem like they are systematically slow on the draw and we should learn from their mistakes.” I don’t think any of those statements imply I think there’s a 99.9% chance that it’s a tiger. It’s more a statement rejecting the reasons why people think there is a bear, and disagreeing with those reasons, and expecting their views to predictably change over time. But I could say all that while still acknowledging some chance that the tiger is a hoax, that there is a new species of animal that’s kind of like a tiger, that the animal we saw in photos is different from the one that’s eating people, or whatever else. The exact smallness of the probability of “actually it wasn’t the tiger after all” is not central to my claim that it’s obvious or that people will come around.
I don’t think it’s central to this point, but I think 99% is a defensible estimate for many-worlds. I would probably go somewhat lower but certainly wouldn’t run victory laps about that or treat it as damning of someone’s character. The above is mostly a bad analogy explaining why I think it’s pretty reasonable to say things like Eliezer did even if your all-things-considered confidence was 99% or even lower.
To get a sense for what Eliezer finds frustrating and intends to critique, you can read If many-worlds had come first (which I find quite obnoxious). I think to the extent that he’s wrong it’s generally by mischaracterizing the alternative position and being obnoxious about it (e.g. misunderstanding the extent to which collapse is proposed as ontologically fundamental rather than an expression of agnosticism or a framework for talking about experiments, and by slightly misunderstanding what “ontologically fundamental collapse” would actually mean). I don’t think it has much to do with overconfidence directly, or speaks to the quality of Eliezer’s reasoning about the physical world, though I think it is a bad recurring theme in Eliezer’s reasoning about and relationships with other humans. And in fairness I do think there are a lot of people who probably deserve Eliezer’s frustration on this point (e.g. who talk about how collapse is an important and poorly-understood phenomenon rather than most likely just being the most boring thing) though I mostly haven’t talked with them and I think they are systematically more mediocre physicists.
“Maybe (3) is a little unfair, or sounds harsher than I meant it. It’s a bit unclear to me how seriously to take Aaronson’s quote. It seems like plenty of physicists have looked through the sequences to find glaring flaws, and basically found none (physics stackexchange).”
Here are a couple: he conflates Copenhagen and objective collapse throughout.
He fails to distinguish Everettian and decoherence-based MWI.