I’ve had some time to think about this post and its reception both here and on LessWrong. There’s a lot of discussion about the object-level claims, and I don’t think I have much to add to adjudicating them beyond what’s been said already, so I won’t. Instead, I want to look at why this post matters at all.
1: Why does it matter if someone is wrong, frequently or egregiously?
I think this post thinks that its thesis matters because of the reach of Eliezer’s influence on the rationalist and EA communities. It certainly seems historically true given Eliezer’s position as one of the key founders of the Rationalist movement, but I don’t know how strong it is now, or how that question could be operationalised in a way where people could change their minds about it.
If you think Eliezer believes some set of beliefs X that are ‘egregiously wrong’ then it’s probably worth writing separate posts about those issues rather than a hit piece. If you think that the issue is dangerous community epistemics surrounding Eliezer, then it’d probably be better if you focused on establishing that before bringing up the object level, or even bringing up the object level at all.
This has been a theme of quite a few posts recently (i.e. last year or so) on the Forum, but I think I’d like to see some more thoughts explaining what people mean by ‘deference’ or ‘epistemic norms’, and ideally some more concrete evidence about them being good or bad beyond personal anecdotes/vibes at an EAG.
2: Did it need to be said in this way?
Ironically, a lot of what Omnizoid criticises Eliezer for is stuff I find myself thinking about Omnizoid takes some of the time! I definitely think this post could have had a better light-to-heat ratio if it had been worded and structured differently, and I think it’s to your credit Omni that you recognised this, but bad that you posted it in its original state on both Forums.
3: Why is Eliezer so confident?
I’ve never met Eliezer or interacted in the same social circles, so I don’t know to what extent personality figures into it. I think Eliezer most clearly argues for this approach in his book Inadequate Equilibria, where he argues against what he calls ‘modest epistemology’ (see Chapter 6). I think he’d rather believe strongly and update accordingly[1] if proven wrong than slowly approach the right belief via gradient descent. This would explain why he’s confident both when he’s right and when he’s wrong.
4: Why is the EA/LW reaction so different?
The reaction is definitely more ‘positive’ on the EA Forum than on LessWrong. But I wouldn’t call it ‘positive’ (57 karma, 106 votes, 116 comments at time of writing), which is decidedly ‘mixed’ at best. So I don’t think you can read this as ‘EA Forum says yeah, screw Eliezer’ and ‘LessWrong says boo, stupid EA Forum’; I think both communities’ views are more nuanced than that.
I do get a sense that while many on LW disagree with Eliezer on a lot, everyone there respects him and his accomplishments, whereas there is an EAF contingent that really doesn’t like Eliezer or his positions/vibes, and is happy to see him ‘taken down a peg or two’. I think there’s a section of LW that doesn’t like this section of EA, hence Eliezer’s claim about the ‘downfall of EA’ in his response.[2]
5: An attempted inoculation?
I think this is again related to outsiders’ perception of EA. Lots of our critics, fairly or unfairly, home in on Eliezer and his views that are more confident/outside the Overton window, and run with those to tarnish both EA and rationalism. Maybe this post is attempting to show, internally and externally, that Eliezer isn’t/shouldn’t be highly respected in the community, to inoculate against this criticism, but I’m not sure it does that well.
I think having this meta-level discussion about what discussion we’re actually having, or want to have, helps move us in the light-not-heat direction. All in all, I think the better discussion would be to try to find some actual measure of Eliezer’s influence. I think my main point is #1: there are some good object-level discussions here, and it’s worth being wary of the confidence the community places in primary figures, but on balance, on reflection, I don’t think this was the right way to go about it.
Thanks for this comment. I agree with 2. On 3, it seems flatly irrational to have super high credences when experts disagree with you and you do not have any special insights.
If an influential person who is given lots of deference is often wrong, that seems notable. If people were largely influenced by my blog, and I was often full of shit, expressing confident views on things I didn’t know about, that would be noteworthy.
Agree with 4.
On 5, I wasn’t intending to criticize EA or rationalism. I’m a bit lukewarm on rationalism, but enthusiastically pro EA, and have, in fact, written lengthy responses to many of the critics of EA. Really my aim was to show that Eliezer is worthy of much less deference than he is currently given, and to argue the object level: that many of his views, commonly believed in the community, are badly mistaken.
I guess on #3, I suggest reading Inadequate Equilibria. I think it’s given me more insight into Eliezer’s approach to making claims. The Bank of Japan example he uses in the book is, ironically, probably one of the clearest examples of an incorrect, egregious, and overconfident mistake. I think the question of when to trust your own judgement over experts, how much to incorporate expert views into your own, and how to identify experts in the first place is an open issue (perhaps insoluble?).
Point taken on #5, was definitely my most speculative point.
I think it comes back to Point #1 for me. If your core aim was “to show that Eliezer is worthy of much less deference than he is currently given”, then I’d want you to show how much deference is actually given to him over and above the validity of his ideas, the mechanisms by which it spreads in the community, and why that’s a potential issue, rather than litigating individual object-level cases. If, instead, your issue is with commonly-believed views in the community that you think are incorrect, then you could have argued against those beliefs without necessarily invoking or focusing on Eliezer. In a way the post suffers from trying to be both of those critiques at once, at least in my opinion. That’s at least the feedback I’d give if you wanted to revisit this issue (or a similar one) in the future.
[1] Again, not making a claim on whether he does or not.
[2] Epistemic status: reading vibes, very unconfident.