What do you dislike about the LW style? Can you provide more specifics? There’s a big range of authors that have been publishing on LW for several years now.
I can try! But apologies as this will be vague—there’ll be lots of authors this doesn’t apply to, and this is my gestalt impression given I avoid reading much of it. And as I say, I don’t know how beneficial LW was to EA’s development, so I’m not confident about how future exchange should go.
I tend to be frustrated by the general tendencies towards over-confidence, in-group jargon, and overrating the abilities or insights of their community/influences vs. others (esp. expert communities and traditional academic sources). Most references I see to ‘epistemics’ seem under-specified and not useful, and usually a short-hand way to dismiss a non-conforming view. I find it ironic that denigrating others’ ‘epistemics’ is a common LW refrain given my impression that the epistemic quality of LW seems poor.
There’s a kind of Gell-Mann amnesia effect I get: on topics I know decently well (medical advice, neuroscience, global health), I can easily see the LW discourse as wrong, poorly conceived and argued, and over-confident. I don’t have a clear personal view on the LW discourse on things I don’t know well, like AI, but I have occasionally seen ~similar takes to mine from some people who do know AI well.
There are definitely writers/thinkers I admire from LW, but I usually admire them despite their LW-like tendencies. Losing their input would be a true loss. But for overall effect on EA, I doubt (with weak confidence) these exemplars outweigh the subpar majority.