I don’t find this explanation convincing fwiw. Eliezer is an incredible case of hero-worship—it’s become the norm to just link to jargon he created as though it’s enough to settle an argument. The closest thing we have here is Will, and most EAs seem to favour him for his character rather than necessarily agreeing with his views—let alone linking to his posts like they were scripture.
Other than the two of them, I wouldn’t say there’s much discussion of personalities and individuals on either forum.
I think that you misunderstand why people link to things.
If someone didn’t get why I feel morally obligated to help people who live in distant countries, I would likely link them to Singer’s drowning child thought experiment. Either during my explanation of how I feel, or in lieu of one if I were busy.
This is not because I hero-worship Singer. This is not because I think his posts are scripture. This is because I broadly agree with the specific thing he said which I am linking, and he put it well, and he put it first, and there isn’t a lot of point of duplicating that effort. If after reading you disagree, that’s fine, I can be convinced. The argument can continue as long as it doesn’t continue for reasons that are soundly refuted in the thing I just linked.
I link people to things pretty frequently in casual conversation. A lot of the time, I link them to something posted to the EA Forum or LessWrong. A lot of the time, it’s something written by Eliezer Yudkowsky. This isn’t because I hero-worship him, or because I think linking to something he said settles an argument—it’s because I broadly agree with the specific thing I’m linking and don’t see the point of duplicating effort. If after reading you disagree, that’s fine, I can be convinced. The argument can continue as long as it doesn’t continue for reasons that are soundly refuted in the thing I just linked.
There are a ton of people I’d like to link to as frequently as I do Eliezer. But Eliezer wrote in short, easily digested essays, on the internet rather than as chapters in a paper book or PDF. He’s easy to link to, so he gets linked.
There’s a world of difference between the link-phrases ‘here’s an argument about why you should do x’ and ‘do x’. Only Eliezer seems to regularly merit the latter.
Here are the last four things I remember seeing linked as supporting evidence in casual conversation on the EA forum, in no particular order:
https://forum.effectivealtruism.org/posts/LvwihGYgFEzjGDhBt/?commentId=HebnLpj2pqyctd72F—link to Scott Alexander, “We have to stop it with the pointless infighting or it’s all we will end up doing,” is ‘do x’-y if anything is. (It also sounds like a perfectly reasonable thing to say and a perfectly reasonable way to say it.)
https://forum.effectivealtruism.org/posts/LvwihGYgFEzjGDhBt/?commentId=SCfBodrdQYZBA6RBy—separate links to Scott Alexander and Eliezer Yudkowsky, neither of which seem very ‘do x’-y to me.
https://forum.effectivealtruism.org/posts/irhgjSgvocfrwnzRz/?commentId=NF9YQfrDGPcH6wYCb—link to Scott Alexander, seems somewhat though not extremely ‘do x’-y to me. Also seems like a perfectly reasonable thing to say and I stand by saying it.
https://forum.effectivealtruism.org/posts/LvwihGYgFEzjGDhBt/?commentId=x5zqnevWR8MQHqqvd—link to Duncan Sabien, “I care about the lives we can save if we don’t rush to conclusions, rush to anger, if we can give each other the benefit of the doubt for five freaking minutes and consider whether it’d make any sense whatsoever for the accusation de jour to be what it looks like,” seems pretty darn ‘do x’-y. I don’t necessarily stand behind how strongly I came across there; I was in a pretty foul mood.
I think that mostly, this is just how people talk.
I am not making the stronger claim that there are zero people who hero-worship Eliezer Yudkowsky.
Also, I would add at the very least Gwern (which might be relevant to note regarding the current topic) and Scott Alexander as two other clear cases of “personalities” on LW.
I agree that there are of course individual people who are trusted and who have a reputation within the community, but conversations about Scott Alexander’s personality, or his reputation, or his net effect on the world, are much rarer on LW than on the EA Forum, as far as I can tell.
Like, when actually was the last thread on LW about drama caused by a specific organization or individual? In my mind almost all of that tends to congregate on the EA Forum.
My guess is that you see this more in EA because the stakes are higher for EAs. There’s much more of a sense that people here are contributing to the establishment and continuation of a movement, the movement is often core to people’s identities (it’s why they do what they do, live where they live, etc), and ‘drama’ can have consequences on the progress of work people care a lot about. Few people are here just for the interesting ideas.
While LW does have a bit of a corresponding rationality movement I think it’s weaker or less central on all of these angles.
Yep, I agree, that’s a big part of it.
I think Jeff is right, but I would go so far as to say the hero worship on LW is so strong that there’s also a selection effect—if you don’t find Eliezer and co. convincing, you won’t spend time on a forum that treats them with such reverence (this, at least, is part of why I’ve never spent much time there, despite being a cold calculating Vulcan type).
Re drama around organisations, there are way more orgs which one might consider EA than which one might consider rationalist, so there’s just more available lightning rods.
It’s a plausible explanation! I do think even for Eliezer, I really don’t remember much discussion of like, him and his personality in the recent years. Do you have any links? (I can maybe remember something from like 7 years ago, but nothing since LW 2.0).
Overall, I think there are a bunch of other also kind of bad dynamics going on on LW, but I do genuinely think that there isn’t that much hero worship, or institution/personality-oriented drama.
I’m saying the people who view him negatively just tend to self-select out of LW. Those who remain might not bother to have substantive discussion—it’s just that the average mention of him seems ridiculously deferential/overzealous in describing his achievements (for example, I recently went to an EAGx talk which described him, along with Tetlock, as one of the two ‘fathers of forecasting’).
If you want to see negative discussion of him, that seems to be basically what RationalWiki and r/Sneerclub exist for.
Putting Habryka’s claim another way: if Eliezer right now were involved in a huge scandal, as, say, SBF or Will MacAskill was, then I think modern LW would mostly handle it pretty fine. Not perfectly, but I wouldn’t expect nearly the amount of drama that EA’s getting. (Early LW from the 2000s or early 2010s would probably do worse, IMO.) My suspicion is that LW would have way less personal drama over Eliezer than, say, EA would over SBF or Nick Bostrom.
I think there are a few things going on here, not sure how many we’d disagree on. I claim:
Eliezer has direct influence over far fewer community-relevant organisations than Will does or SBF did (cf. the comment above that there exist far fewer such orgs for the rationalist community). Therefore a much smaller proportion of his actions are relevant to the LW community than Will’s are and SBF’s were to the EA community.
I don’t think there’s been a huge scandal involving Will? Sure, there are questions we’d like to see him openly address about what he could have done differently re FTX—and I personally am concerned about his aforementioned influence because I don’t want anyone to have that much—but very few if any people here seem to believe he’s done anything in seriously bad faith.
I think the a priori chance of a scandal involving Eliezer on LW is much lower than the chance of a scandal on here involving Will because of the selection effect I mentioned—the people on LW are selected more strongly for being willing to overlook his faults. The people who both have an interest in rationality and get scandalised by Bostrom/Eliezer hang out on Sneerclub, pretty much being scandalised by them all the time.
The culture on here seems more heterogeneous than LW’s. Inasmuch as we’re more drama-prone, I would guess that’s the main reason why—there’s a broader range of viewpoints and events that will trigger a substantial proportion of the userbase.
So these theories support/explain why there might be more drama on here, but push back against the ‘no hero-worship/not personality-oriented’ claims, which both ring false to me. Overall, I also don’t think the lower drama on LW implies a healthier epistemic climate.
I was imagining a counterfactual world where William Macaskill did something hugely wrong.
And yeah come to think of it, selection may be quite a bit stronger than I think.
The bigger discussion from maybe seven years ago that Habryka refers to was, as far as my memory goes, Eliezer’s April Fools’ post in 2014 about Dath Ilan. The resulting discussion was critical enough of EY that from that point on most of his writing was published on Facebook/Twitter rather than LessWrong. On his Facebook feed he can simply ban people he finds annoying, but on LessWrong he couldn’t.
Izzat true? Aside from edited versions of other posts and cross-posts by the LW admins, I see zero EY posts on LW between mid-September 2013 and Aug 2016, versus 21 real posts earlier in 2013, 29 in 2012, 12 in 2011, 17 in 2010, and ~180 in 2009.
So I see a big drop-off after the Sequences ended in 2009, and a complete halt in Sep 2013. Though I guess if he’d mostly stopped posting to LW anyway and then had a negative experience when he poked his head back in, that could cement a decision to post less to LW.
(This is the first time I’m hearing that the post got deleted, I thought I saw it on LW more recently than that?)
2017 is when LW 2.0 launched, so 2014-2016 was also a nadir in the site’s quality and general activity.
I was active on LessWrong at the time and am mostly going off my memory, and memory of something that happened eight years ago isn’t perfect.
https://yudkowsky.tumblr.com/post/81447230971/my-april-fools-day-confession was, to my memory, also posted to LessWrong, and the LessWrong version of that post has been deleted.
A Google search of LessWrong for that timeframe doesn’t bring up any mention of Dath Ilan.
Is your memory that Dath Ilan was just never talked about on LessWrong when Eliezer wrote that post?
Which post is this? I looked on EY’s LW profile but couldn’t see which one this was referring to. There’s this blog post, https://yudkowsky.tumblr.com/post/81447230971/my-april-fools-day-confession, but it’s not on LW. Also, it looks like there have been a lot of posts from EY on LW since 2014?
I think that’s the post. As far as my memory goes, the criticism led to Eliezer deleting it from LessWrong.
As someone who started reading LW several months ago, I think that Eliezer is a great thinker, but he gets things wrong quite often. He is not a perfect thinker or hero, but he was quite a bit better (arguably far better) than most.
I wouldn’t idolize him, but nor would I ignore Eliezer’s accomplishments.
This seems to fit with the fact that there wasn’t much appetite for the consequentialist argument against Bostrom until the term “information hazard” came up.