I’m trying to make sense of why you’re bringing up “overconfidence” here. The only thing I can think of is that you think there may simply not be enough information to figure out whether MWI is right or wrong (not even for an ideal reasoner with a brain the size of Jupiter and a billion years to ponder the topic), and therefore saying “MWI is unambiguously correct” is “overconfident”?
Here’s my point: There is a rational limit to the amount of confidence one can have in MWI (or any belief). I don’t know exactly where this limit is for MWI-extremism, but Yudkowsky clearly exceeded it sometimes. To use made-up numbers, suppose:
- MWI is objectively correct.
- Eliezer says P(MWI is correct) = 0.9999999.
- But rationally one can only reach P(MWI) = 0.999, because there are remaining uncertainties that cannot be eliminated through superior thinking and careful consideration, such as the lack of experimental evidence, the possibility of QM getting overturned, the possibility of a new and better interpretation in the future, and unknown unknowns. These factors add up to at least P(Not MWI) = 0.001.
Then even though Eliezer is right that MWI is correct, he is still significantly overconfident in that belief.
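To make the gap in those made-up numbers concrete, here is a minimal sketch in Python. Every figure in it is just the illustrative number from the example above (not anything Eliezer actually stated); it shows how large the step from 0.999 to 0.9999999 is in odds terms, and how a log score would treat the more extreme forecast if 0.999 really were the right confidence level:

```python
import math

def log10_odds(p):
    """Convert a probability into log-base-10 odds."""
    return math.log10(p / (1 - p))

p_rational = 0.999       # the made-up "rational ceiling" from the example above
p_extreme = 0.9999999    # the made-up figure attributed to Eliezer in the example

print(log10_odds(p_rational))  # ~3.0  -> odds of roughly 1,000 : 1
print(log10_odds(p_extreme))   # ~7.0  -> odds of roughly 10,000,000 : 1
# i.e. the extreme forecast claims about four extra orders of magnitude of evidence.

# Expected extra log-score loss of the extreme forecast, assuming 0.999 really is
# the right confidence level (this is the KL divergence between the two forecasts):
extra_loss = (p_rational * math.log(p_rational / p_extreme)
              + (1 - p_rational) * math.log((1 - p_rational) / (1 - p_extreme)))
print(extra_loss)  # ~0.008 nats on average, driven by a large penalty in the 0.1%
                   # of cases where MWI turns out to be wrong
```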
Consider Paul’s example of Eliezer saying MWI is comparable to heliocentrism:
If we are deeply wrong about physics, then I [Paul Christiano] think this could go either way. And it still seems quite plausible that we are deeply wrong about physics in one way or another (even if not in any particular way). So I think it’s wrong to compare many-worlds to heliocentrism (as Eliezer has done). Heliocentrism is extraordinarily likely even if we are completely wrong about physics—direct observation of the solar system really is a much stronger form of evidence than a priori reasoning about the existence of other worlds.
I agree with Paul here. Heliocentrism is vastly more likely than any particular interpretation of quantum mechanics, and Eliezer was wrong to have made this comparison.
This may sound like I’m nitpicking, but I think it fits into a pattern of Eliezer making dramatic and overconfident pronouncements, and it’s relevant information for people to consider, e.g., when evaluating Eliezer’s belief that p(doom) = ~1 and that the AI safety situation is so hopeless that the only thing left is to die with slightly more dignity.
Of course, it’s far from the only relevant data point.
Regarding (2), I think we’re on the same page haha.
Could someone point to the actual quotes where Eliezer compares heliocentrism to MWI? I don’t generally assume that when people are ‘comparing’ two very-high-probability things, they’re saying they have the same probability. Among other things, I’d want confirmation that ‘Eliezer and Paul assign roughly the same probability to MWI, but they have different probability thresholds for comparing things to heliocentrism’ is false.
E.g., if I compare Flat Earther beliefs, beliefs in psychic powers, belief ‘AGI was secretly invented in the year 2000’, geocentrism, homeopathy, and theism to each other, it doesn’t follow that I’d assign the same probabilities to all of those six claims, or even probabilities that are within six orders of magnitude of each other.
In some contexts it might indeed Griceanly imply that all six of those things pass my threshold for ‘unlikely enough that I’m happy to call them all laughably silly views’, but different people have their threshold for that kind of thing in different places.
Gotcha, thanks. I guess we have an object-level disagreement: I think that careful thought reveals MWI to be unambiguously correct, with enough 9’s as to justify Eliezer’s tone. And you don’t. ¯\_(ツ)_/¯
(Of course, this is bound to be a judgment call; e.g. Eliezer didn’t state how many 9’s of confidence he has. It’s not like there’s a universal convention for how many 9’s are enough 9’s to state something as a fact without hedging, or how many 9’s are enough 9’s to mock the people who disagree with you.)
Yes, agreed.
Let me lay out my thinking in more detail. I mean this as an explanation of my views, not as an attempt to persuade.
Paul’s account of Aaronson’s view says that Eliezer shouldn’t be as confident in MWI as he is, which in words sounds exactly like my point and is similar to Aaronson’s Stack Exchange answer. But it still leaves open the question of how overconfident he was, and what, if anything, should be taken away from this. It’s possible that there’s a version of my point which is true but also uninteresting or trivial (who cares if Yudkowsky was 10% too confident about MWI 15 years ago?).
And it’s worth reiterating that a lot of people give Eliezer credit for his writing on QM, including for being forceful in his views. I have no desire to argue against this. I had hoped to sidestep discussing this entirely since I consider it to be a separate point, but perhaps this was unfair and led to miscommunication. If someone wants to write a detailed comment/post explaining why Yudkowsky deserves a lot of credit for his QM writing, including credit for how forceful he was at times, I would be happy to read it and would likely upvote/strong upvote it depending on quality.
However, here my intention was to focus on the overconfidence aspect.
I’ll explain what I see as the epistemic mistakes Eliezer likely made to end up in an overconfident state. Why do I think Eliezer was overconfident on MWI?
(Some of the following may be wrong.)
1. He didn’t understand non-MWI-extremist views, which should have rationally limited his confidence.
    - I don’t have sources for this, but I think something like this is true.
    - This was an avoidable mistake.
    - Worth noting: according to Rob’s comment elsewhere in this thread, Eliezer has since updated towards the competence of elites in science relative to some of his early writing.
2. It’s possible that his technical understanding was uneven. This should also have limited his confidence.
    - Aaronson praised him for “actually get[ting] most of the technical stuff right”, which of course implies that not everything technical was correct.
    - He also suggested a specific technical flaw in Yudkowsky’s understanding.
    - One big problem with drawing extreme conclusions from uneven technical understanding is that you don’t know what you don’t know. And in fact Aaronson suggests a mistake Yudkowsky seems unaware of as a reason why Yudkowsky’s central argument is overstated (i.e. why Yudkowsky is overconfident about MWI).
    - However, it’s unclear how true or important this point really is.
3. At least four points limit confidence in P(MWI) to some degree:
    - Lack of experimental evidence
    - The possibility of QM getting overturned
    - The possibility of a new and better interpretation in the future
    - Unknown unknowns
    - I believe most or all of these are valid, commonly brought up points that together limit how confident anyone can be in P(MWI); a toy calculation below illustrates how they compound. Reasonable people may of course disagree about their weighting.
    - I am skeptical that Eliezer correctly accounted for these factors.

Note that these are all points about the epistemic position Eliezer was in, not about the correctness of MWI. The first two are particular to him, and the last one applies to everyone.
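As the toy calculation promised above, here is a short sketch of how a handful of small, roughly independent residual doubts cap the confidence anyone can have in MWI. The specific numbers are mine, made up purely for illustration (they are not anyone’s actual estimates):

```python
# Toy sketch: a few small, roughly independent residual doubts put a ceiling on
# how confident anyone can be in MWI. The probabilities are made up for illustration.
residual_doubts = {
    "no direct experimental discrimination between interpretations": 1e-3,
    "QM itself gets overturned":                                     1e-4,
    "a new and better interpretation appears in the future":         1e-3,
    "unknown unknowns":                                              1e-3,
}

p_mwi_ceiling = 1.0
for source, p_wrong in residual_doubts.items():
    p_mwi_ceiling *= (1 - p_wrong)  # each doubt shaves off a little confidence

print(f"rough ceiling on P(MWI): {p_mwi_ceiling:.5f}")
# ~0.99690 -- about "three nines", far short of 0.9999999, on these made-up numbers
```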
Now, Rob points out that maybe the heliocentrism example is lacking context in some way (if it isn’t, I find it a very compelling example of a seriously overconfident mistake). Personally I think there are at least a couple[1][2] of places in the sequences where Yudkowsky clearly says something that indicates ridiculous overconfidence tied to epistemic mistakes, but to be honest I’m not excited to argue about whether some of his language 15 years ago was or wasn’t overzealous.
The reason I brought this up despite it being a pretty minor point is that I think it’s part of a general pattern of Eliezer being overconfident in his views and overstating them. I am curious how much people actually disagree with this.
Of course, whether Eliezer has a tendency to be overconfident and overstate his views is only one small data point among very many others in evaluating p(doom), the value of listening to Eliezer’s views, etc.
[1] “Many-worlds is an obvious fact, if you have all your marbles lined up correctly (understand very basic quantum physics, know the formal probability theory of Occam’s Razor, understand Special Relativity, etc.)”
[2] “The only question now is how long it will take for the people of this world to update.” Both quotes from https://www.lesswrong.com/s/Kqs6GR7F5xziuSyGZ/p/S8ysHqeRGuySPttrS
For what it’s worth, consider the claim “The Judeo-Christian God, the one who listens to prayers and so on, doesn’t exist.” I have such high confidence in this claim that I would absolutely state it as a fact without hedging, and psychoanalyze people for how they came to disagree with me. Yet there’s a massive theology literature arguing to the contrary of that claim, including by some very smart and thoughtful people, and I’ve read essentially none of this theology literature, and if you asked me to do an anti-atheism ITT I would flunk it catastrophically.
I’m not sure what lesson you’ll take from that; for all I know you yourself are very religious, and this anecdote will convince you that I have terrible judgment. But if you happen to be on the same page as me, then maybe this would be an illustration of the fact that (I claim) one can rationally and correctly arrive at extremely-confident beliefs without it needing to pass through a deep understanding and engagement with the perspectives of the people who disagree with you.
I agree that this isn’t too important a conversation, it’s just kinda interesting. :)
I’m not sure either of the quotes you cited by Eliezer require or suggest ridiculous overconfidence.
If I’ve seen some photos of a tiger in town, and I know a bunch of people in town who got eaten by an animal, and we’ve all seen some apparent tiger-prints near where people got eaten, I may well say “it’s obvious there is a tiger in town eating people.” If people used to think it was a bear, but that belief was formed based on priors when we didn’t yet have any hard evidence about the tiger, I may be frustrated with people who haven’t yet updated. I may say “The only question is how quickly people’s views shift from bear to tiger. Those who haven’t already shifted seem like they are systematically slow on the draw and we should learn from their mistakes.” I don’t think any of those statements imply I think there’s a 99.9% chance that it’s a tiger. It’s more a statement rejecting the reasons why people think there is a bear, and disagreeing with those reasons, and expecting their views to predictably change over time. But I could say all that while still acknowledging some chance that the tiger is a hoax, that there is a new species of animal that’s kind of like a tiger, that the animal we saw in photos is different from the one that’s eating people, or whatever else. The exact smallness of the probability of “actually it wasn’t the tiger after all” is not central to my claim that it’s obvious or that people will come around.
I don’t think it’s central to this point, but I think 99% is a defensible estimate for many-worlds. I would probably go somewhat lower but certainly wouldn’t run victory laps about that or treat it as damning of someone’s character. The above is mostly a bad analogy explaining why I think it’s pretty reasonable to say things like Eliezer did even if your all-things-considered confidence was 99% or even lower.
To get a sense for what Eliezer finds frustrating and intends to critique, you can read If many-worlds had come first (which I find quite obnoxious). I think to the extent that he’s wrong it’s generally by mischaracterizing the alternative position and being obnoxious about it (e.g. misunderstanding the extent to which collapse is proposed as ontologically fundamental rather than an expression of agnosticism or a framework for talking about experiments, and by slightly misunderstanding what “ontologically fundamental collapse” would actually mean). I don’t think it has much to do with overconfidence directly, or speaks to the quality of Eliezer’s reasoning about the physical world, though I think it is a bad recurring theme in Eliezer’s reasoning about and relationships with other humans. And in fairness I do think there are a lot of people who probably deserve Eliezer’s frustration on this point (e.g. who talk about how collapse is an important and poorly-understood phenomenon rather than most likely just being the most boring thing) though I mostly haven’t talked with them and I think they are systematically more mediocre physicists.