I think a plausibly good training exercise for EAs wanting to be better at empirical/conceptual research is to deep dive into seminal papers/blog posts and attempt to identify all the empirical and conceptual errors in past work, especially a) writings by other respected EAs or b) other work that we otherwise think of as especially important.
I'm not sure how knowledgeable you have to be to do this well, but I suspect it's approachable for smart people who have finished high school, and certainly by the time they finish undergrad with a decent science or social science degree.
I think this is good career building for various reasons:
- you can develop a healthy skepticism of the existing EA orthodoxy
  - I mean skepticism that's grounded in specific beliefs about why things ought to be different, rather than just vague "weirdness heuristics" or feeling like the goals of EA conflict with other tribal goals.
- you actually deeply understand at least one topic well enough to point out errors
- it creates legible career capital (at least within EA)
- it requires relatively little training/guidance from external mentors, meaning
  - our movement devotes fewer of its scarce mentorship resources to this
  - people with worse social skills/network/geographical situation don't feel (as much) at a disadvantage in getting the relevant training
- you can start forming your own opinions/intuitions of both object-level and meta-level heuristics for what things are likely to be correct vs. wrong.
- In some cases, the errors are actually quite big, and worth correcting (relevant parts of) the EA movement on.
Main "cons" I can think of:
- I'm not aware of anybody successfully doing a really good critique purely for the sake of doing a really good critique. The most exciting examples I'm aware of publicly (zdgroff's critique of Ng's original paper on wild animal suffering, alexrjl's critique of Giving Green; I also have private examples) mostly come from people trying to deeply understand a thing for themselves, and then spotting errors in existing work along the way.
- It's possible that doing deliberate "red-teaming" would make one predisposed to spot trivial issues rather than serious ones, or to falsely identify issues where there aren't any.
- Maybe critiques are a less important skill to develop than forming your own vision/research direction and executing on it, and telling people to train for this skill might actively hinder their ability to be bold & imaginative?
(An idea from Linch; see also the comments on the shortform.)