While justifying this claim is beyond the scope of this post, in my subjective estimation EA epistemics are generally worse than what I have experienced in my job and elsewhere.
I get why you didn’t want to open this can of worms, but this seems like a really big deal to me and I would love to hear more, here or in private.
I’ve noticed this too, in two ways.
First, the EA community tends to prefer information from “EA-aligned” people on a topic rather than from academic experts in that topic. I’ve noticed this in climate change mitigation, air quality, and aerosol-based disease transmission (topics I’m an expert in). I presume the same issue exists in other cause areas as well.
Second, the EA shift from global health and animal causes toward longtermism-focused efforts has corresponded with a move away from RCTs and testable statements toward unprovable claims built on argument. Longtermist arguments, by their nature, can’t be tested. This also means that uncontested beliefs and worldviews heavily influence which arguments are seen as robust or valid. It’s a matter of personal perception whether the SBF fraud and FTX collapse were an inevitable consequence of longtermist beliefs and worldviews, or a separate, distorted version of them. Either way, the consequence is the same: for many, EA is now a low-trust environment, rather than the high-trust environment it was in the beginning.
I’m increasingly relying on sources outside EA to inform my EA-related giving (with important exceptions: I think GiveWell and Rethink Priorities are excellent, reliable sources within the EA community).
Strong +1 on the first point. EA folks have done good work in these areas, but it’s swamped by the good work done by outsiders that I never see referenced.