Here I'll keep track of additional intervention ideas that have occurred to me since I finished drafting this post. Perhaps in future I'll integrate some into the post itself.
Creating and/or improving EA-relevant journals
Could draw more people towards paying attention to important topics
Could make it easier for EAs doing graduate programs (especially PhDs) or pursuing academic careers to focus on high-priority topics and pursue them in the most impactful ways
That could in turn help with "Increasing and/or improving EAs' use of non-EA options for research training, credentials, etc."
Making high-quality data that's relevant to high-priority topics more easily available
The idea here is that "a lot of researchers will follow good data wherever it comes from"
(This was suggested by a commenter on a draft of this post)
An idea from Linch (see also the comments on the shortform itself):
I think a plausibly good training exercise for EAs wanting to be better at empirical/conceptual research is to deep dive into seminal papers/blog posts and attempt to identify all the empirical and conceptual errors in past work, especially a) writings by other respected EAs or b) other work that we otherwise think of as especially important.
I'm not sure how knowledgeable you have to be to do this well, but I suspect it's approachable for smart people who have finished high school, and certainly by the time they finish undergrad with a decent science or social science degree.
I think this is good career building for various reasons:
you can develop a healthy skepticism of the existing EA orthodoxy
I mean skepticism that's grounded in specific beliefs about why things ought to be different, rather than just vague "weirdness heuristics" or feeling like the goals of EA conflict with other tribal goals.
you actually deeply understand at least one topic well enough to point out errors
creates legible career capital (at least within EA)
requires relatively little training/guidance from external mentors, meaning
our movement devotes fewer scarce mentorship resources to this
people with worse social skills/networks/geographical situations don't feel (as much) at a disadvantage in getting the relevant training
you can start forming your own opinions/intuitions about both object-level and meta-level heuristics for which things are likely to be correct vs. wrong.
In some cases, the errors are actually quite big, and worth correcting (relevant parts of) the EA movement on.
Main "cons" I can think of:
I'm not aware of anybody successfully doing a really good critique for the sake of doing a really good critique. The most exciting things I'm aware of publicly (zdgroff's critique of Ng's original paper on wild animal suffering, alexrjl's critique of Giving Green; I also have private examples) mostly come from people trying to deeply understand a thing for themselves, and then spotting errors with existing work along the way.
It's possible that doing deliberate "red-teaming" would make one predisposed to spot trivial issues rather than serious ones, or falsely identify issues where there aren't any.
Maybe critiquing is a less important skill to develop than forming your own vision/research direction and executing on it, and telling people to train for this skill might actively hinder their ability to be bold & imaginative?
An idea from Buck (see also the comments on the linked shortform itself):
Here's a crazy idea. I haven't run it by any EAIF people yet.
I want to have a program to fund people to write book reviews and post them to the EA Forum or LessWrong. (This idea came out of a conversation with a bunch of people at a retreat; I can't remember exactly whose idea it was.)
Basic structure:
Someone picks a book they want to review.
Optionally, they email me asking how on-topic I think the book is (to reduce the probability of not getting the prize later).
They write a review, and send it to me.
If it's the kind of review I want, I give them $500 in return for them posting the review to EA Forum or LW with a "This post sponsored by the EAIF" banner at the top. (I'd also love to set up an impact purchase thing, but that's probably too complicated.)
If I don't want to give them the money, they can do whatever with the review.
What books are on topic: Anything of interest to people who want to have a massive altruistic impact on the world. More specifically:
Things directly related to traditional EA topics
Things about the world more generally, e.g. macrohistory, how governments work, The Doomsday Machine, history of science (e.g. Asimov's "A Short History of Chemistry")
I think that books about self-help, productivity, or skill-building (e.g. management) are dubiously on topic.
Goals:
I think that these book reviews might be directly useful. There are many topics where I'd love to know the basic EA-relevant takeaways, especially when combined with basic fact-checking.
It might encourage people to practice useful skills, like writing, quickly learning about new topics, and thinking through what topics would be useful to know more about.
I think it would be healthy for EA's culture. I worry sometimes that EAs aren't sufficiently interested in learning facts about the world that aren't directly related to EA stuff. I think that this might be improved both by people writing these reviews and people reading them.
Conversely, sometimes I worry that rationalists are too interested in thinking about the world by introspection or weird analogies relative to learning many facts about different aspects of the world; I think book reviews would maybe be a healthier way to direct energy towards intellectual development.
It might surface some talented writers and thinkers who weren't otherwise known to EA.
It might produce good content on the EA Forum and LW that engages intellectually curious people.
Suggested elements of a book review:
One-paragraph summary of the book
How compelling you found the book's thesis, and why
The main takeaways that relate to vastly improving the world, with emphasis on the surprising ones
Rough notes on another idea, following a call I just had:
Setting up something in between a research training program and a system for collaborations in high schools, universities, or local EA groups
Less vetting and probably lower average current knowledge, aptitude, etc. than research training program participants undergo/have
But this reduces the costs of vetting
And this opens the opportunity up to an additional pool of people (who may not yet be able to pass that vetting)
Plus, this could allow more people to test their fit for and get better at mentorship, by mentoring people in these "programs" or simply by collaborating with peers in these programs (since collaboration still has some mentorship-like elements)
E.g., in some cases, someone who has just started a PhD or just recently learned about the cause area they're now focused on may not be able to usefully serve as a mentor for a participant in a research training program like SERI, but they may be able to usefully serve as a mentor for a high school student or some undergrads
(I'm just saying there'd be some cases in that in-between space; there'd also be, e.g., some PhD students who can usefully serve as mentors for SERI fellows, and some who can't usefully serve as mentors for high school students)