I seem to have quite strongly differing intuitions from most people active in central EA roles, and quite similar ones (at least about the limitations of EA-style research) to many people I’ve spoken to who believe the motte of EA but are sceptical of the bailey (ie of actual EA orgs and methodology). I worry that EA has very strong echo chamber effects, reflected in eg the OP, in Linch’s comment below, in Hauke’s about Bill Gates, in various other comments in this thread suggesting ‘almost no-one’ thinks about these questions with clarity, and in countless other such casual dismissals I’ve heard from EAs of smart people taking positions not couched in sufficiently EA terms.
FWIW I also don’t think claiming someone has lots of other great qualities is inconsistent with being insulting to them.
I don’t disagree that it’s plausible we can bring something. I just think that assuming we can do so is extremely arrogant (not on your part in particular, but as a generalised attitude among EAs). We need to respect the views of intelligent people who think this stuff is important, even if they can’t or don’t explain why in the terms we would typically use. For PR reasons alone, this matters—I can only point to anecdotes, but so many intelligent people I’ve spoken to find EAs collectively insufferable because of this sort of attitude, and so end up not engaging with ideas that might otherwise have appealed to them. Maybe someone could run a Mechanical Turk study on how such messaging affects reception of theoretically unrelated EA ideas.