I've had some time to think about this post and its reception both here and on LessWrong. There's a lot of discussion about the object-level claims, and I don't think I have much to add to adjudicating them beyond what's been said already, so I won't. Instead, I want to look at why this post matters at all.
1: Why does it matter if someone is wrong, frequently or egregiously?
I think this post takes its thesis to matter because of the reach of Eliezer's influence on the rationalist and EA communities. That influence certainly seems real historically, given Eliezer's position as one of the key founders of the Rationalist movement, but I don't know how strong it is now, or how that question could be operationalised in a way that would let people change their minds about it.
If you think Eliezer holds some set of beliefs X that are "egregiously wrong", then it's probably worth writing separate posts about those issues rather than a hit piece. If you think the issue is dangerous community epistemics surrounding Eliezer, then it'd probably be better to focus on establishing that before turning to the object level, or perhaps without bringing up the object level at all.
This has been a theme of quite a few posts on the Forum recently (i.e. over the last year or so), but I'd like to see more thought given to explaining what people mean by "deference" or "epistemic norms", and ideally some more concrete evidence about them being good or bad beyond personal anecdotes or vibes at an EAG.
2: Did it need to be said in this way?
Ironically, a lot of what Omnizoid criticises Eliezer for is stuff I sometimes find myself thinking about Omnizoid's takes! I definitely think this post could have had a better light-to-heat ratio if it had been worded and structured differently, and I think it's to your credit, Omni, that you recognised this, but bad that you posted it in its original state on both Forums.
3: Why is Eliezer so confident?
I've never met Eliezer or moved in the same social circles, so I don't know to what extent personality figures into it. Eliezer most clearly argues for this approach in his book Inadequate Equilibria, where he argues against what he calls "modest epistemology" (see Chapter 6). I think he'd rather believe strongly and update accordingly[1] if proven wrong than slowly approach the right belief via gradient descent. This would explain why he's confident both when he's right and when he's wrong.
4: Why is the EA/LW reaction so different?
The reaction is definitely more "positive" on the EA Forum than on LessWrong. But I wouldn't call it "positive" (57 karma, 106 votes, 116 comments at time of writing). That's decidedly "mixed" at best, so I don't think you can read this as "the EA Forum says yeah, screw Eliezer" and "LessWrong says boo, stupid EA Forum"; I think both communities' views are more nuanced than that.
I do get a sense that while many on LW disagree with Eliezer on a lot, everyone there respects him and his accomplishments, whereas there is an EAF contingent that really doesn't like Eliezer or his positions/vibes and is happy to see him "taken down a peg or two". I think there's a section of LW that doesn't like this section of EA, hence Eliezer's claim about the "downfall of EA" in his response.[2]
5: An attempted inoculation?
I think this is again related to how outsiders perceive EA. Lots of our critics, fairly or unfairly, home in on Eliezer and his more confident or outside-the-Overton-window views and run with those to tarnish both EA and rationalism. Maybe this post is attempting to show, internally and externally, that Eliezer isn't/shouldn't be highly respected in the community, to inoculate against this criticism, but I'm not sure it does that well.
I think having this meta-level discussion about what discussion we're actually having, or want to have, helps move us in the light-not-heat direction. All in all, I think the better discussion would be about how to actually measure Eliezer's influence. My main point is #1: there are some good object-level discussions to be had, and it's worth being wary of the confidence the community places in primary figures, but on reflection I don't think this was the right way to go about it.
[1] Again, not making a claim on whether he does or not.
[2] Epistemic status: reading vibes, very unconfident.
Thanks for this comment. I agree with 2. On 3, it seems flatly irrational to have super high credences when experts disagree with you and you do not have any special insights.
If an influential person who is given lots of deference is often wrong, that seems notable. If people were largely influenced by my blog, and I was often full of shit, expressing confident views on things I didn't know about, that would be noteworthy.
Agree with 4.
On 5, I wasn't intending to criticize EA or rationalism. I'm a bit lukewarm on rationalism, but enthusiastically pro EA, and have, in fact, written lengthy responses to many of the critics of EA. Really my aim was to show that Eliezer is worthy of much less deference than he is currently given, and to argue the object level: that many of his views, commonly believed in the community, are badly mistaken.
I guess on #3, I suggest reading Inadequate Equilibria. I think it's given me more insight into Eliezer's approach to making claims. The Bank of Japan example he uses in the book is, ironically, probably one of the clearest examples of an egregious, overconfident mistake. I think the question of when to trust your own judgement over experts, how much to incorporate expert views into your own, and how to identify experts in the first place is an open and unsolved issue (perhaps insoluble?).
Point taken on #5; it was definitely my most speculative point.
I think it comes back to Point #1 for me. If your core aim was "to show that Eliezer is worthy of much less deference than he is currently given", then I'd want you to show how much deference he is actually given over and above the validity of his ideas, how that deference spreads in the community and by what mechanisms, and why that's a potential issue, rather than litigating individual object-level cases. If instead your issue is with commonly-believed views in the community that you think are incorrect, then you could have argued against those beliefs without necessarily invoking or focusing on Eliezer. In a way, the post suffers from trying to be both of those critiques at once, at least in my opinion. That's at least the feedback I'd give if you wanted to revisit this issue (or a similar one) in the future.