Personally, I’m not a fan of LessWrong’s thinking style, writing style, or intellectual products. As such, I think EA would be better off with less LW influence in the near-to-medium term. However, I’m not familiar enough with EA’s intellectual history to judge how useful LW was to it, and I certainly can’t predict EA’s intellectual future. It seems possible that future exchange would be useful, if only for viewpoint diversity. On balance, though, I’d lean against heavy exchange.
What do you dislike about the LW style? Can you provide more specifics? There’s a wide range of authors who have been publishing on LW for several years now.
I can try! But apologies in advance, as this will be vague: there will be lots of authors this doesn’t apply to, and it’s a gestalt impression, given that I avoid reading much of LW. And as I say, I don’t know how beneficial LW was to EA’s development, so I’m not confident about how future exchange should go.
I tend to be frustrated by the general tendencies toward overconfidence, in-group jargon, and overrating the abilities or insights of their own community and influences relative to others (especially expert communities and traditional academic sources). Most references I see to ‘epistemics’ seem under-specified and not useful, and they usually serve as shorthand for dismissing a non-conforming view. I find it ironic that denigrating others’ ‘epistemics’ is a common LW refrain, given my impression that the epistemic quality of LW itself seems poor.
There’s a kind of Gell-Mann amnesia effect at play: when LW discusses topics I know reasonably well (medical advice, neuroscience, global health), I can easily see the discourse as wrong, poorly conceived and argued, and overconfident. I don’t have a clear personal view on the LW discourse on topics I don’t know well, like AI, but I have occasionally seen roughly similar takes from people who do know AI well.
There are definitely writers and thinkers from LW whom I admire, but I usually admire them despite their LW-like tendencies. Losing their input would be a true loss. But for the overall effect on EA, I doubt (with weak confidence) that these exemplars outweigh the subpar majority.