Notes on EA-related research, writing, testing fit, learning, and the Forum
UPDATE: I now consider my 2022 post Interested in EA/longtermist research careers? Here are my top recommended resources a better starting point than this older post, but this post might be useful after you’ve read that 2022 one.
I’ve had calls with >30 people who are interested in things like testing their fit for EA-aligned research careers, writing on the Forum, “getting up to speed” on areas of EA, etc. (This is usually during EA conferences.)
I gradually collected a set of links and notes that I felt many such people would benefit from seeing, then turned that into a Google Doc. Many people told me they found that doc useful, so I’m now (a) sharing it as a public post, and (b) of course still entertaining the hypothesis that those people were all just vicious liars and sycophants.
Disclaimers
Not all of these links/notes will be relevant to any given person
These links/notes are most relevant to people interested in (1) research roles, (2) roles at explicitly EA organisations, and/or (3) longtermism
But this is just because that’s what I know best
There are of course many important roles that aren’t about research or aren’t at EA orgs!
And I’m happy with many EAs prioritising cause areas other than longtermism
But, in any case, some of the links/notes will also be relevant to other people and pathways
This doc mentions some orgs I work for or have worked for previously, but the opinions expressed here are my own, and I wrote the post (and the doc it evolved from) in a personal capacity
Regarding writing, the Forum, etc.
Why you (yes, you) should post on the EA Forum (a talk by Aaron Gertler)
How you can make an impact on the EA Forum (another talk by Aaron Gertler)
Aaron also made a companion document which I think is useful even if you don’t watch the talk
Feedback available for EA Forum drafts (a post by that Aaron Gertler guy I’ve been hearing so much about lately)
This has great writing tips that definitely apply on the Forum, and ideally would apply everywhere, but unfortunately they don’t perfectly align with the norms in some areas/fields
Regarding types of writing you can do:
Write about any of the research ideas in the links in the next section
Write book reviews (see Buck’s suggestion, my suggestion, and posts tagged books)
“deep dive into seminal papers/blog posts and attempt to identify all the empirical and conceptual errors in past work, especially writings by either a) other respected EAs or b) other stuff that we otherwise think of as especially important.” (see Linch’s shortform)
I also encourage you to consider reading, commenting, and posting on LessWrong
See Welcome to LessWrong! for a great introduction to that site
Sometimes people worry that a post idea might be missing some obvious, core insight, or might just be replicating other writing they haven’t come across. I think this is mainly a problem only inasmuch as it might have been more efficient for you to learn the relevant things some other way than by slowly crafting a post.
So if you can write (a rough version of) the post quickly, you could just do that.
Or you could ask around or make a quick Question post to outline the basic idea and ask if anyone knows of relevant things you should read.
Research ideas
Research questions that could have a big social impact, organised by discipline
Some history topics it might be very valuable to investigate
This is somewhat less noteworthy than the other links
Programs, approaches, or tips for testing fit for (longtermism-related) research
Programs
Not all of these things are necessarily “open” right now.
Here are things I would describe as research training programs (in alphabetical order to avoid picking favourites):
GPI Early Career Conference Program & Forethought Global Priorities Fellowship
SERI (Stanford Existential Risk Initiative) Summer 2021 Research Fellowship
I have a more thorough, less curated list here
Note: I know less about what the opportunities at the Center for Reducing Suffering and the Nonlinear Fund would be like than I know about what the other opportunities would be like, so I’m not necessarily able to personally endorse those two opportunities.
Other things
See in particular the Resources page
Consider doing a graduate degree?
Perhaps ideally with a supervisor who’s an excellent researcher themselves, and who is either working on high-priority topics or is flexible enough to accommodate you working on high-priority topics
You could contact Effective Thesis for advice on this
Offer to be a research assistant for someone doing great work?
See my Notes from a call with someone who’s an RA to a great researcher
These roles can sometimes be done remotely
Annotated List of Project Ideas & Volunteering Resources—EA Forum
Grants
Improving the EA-aligned research pipeline: Sequence introduction
This sequence is targeted more at people who could help improve the pipeline than at people aiming to enter and move through that pipeline, but various parts could still be useful to the latter group of people.
Readings and notes on how to do high-quality, efficient research
Collection of collections of resources relevant to (research) management, mentorship, training, etc.
Getting “up to speed” on EA, longtermism, x-risks, etc.
The Precipice
You can search the EA Forum tags page for relevant tags, read those pages (since they’re also Wiki entries), and use them as directories to useful posts, authors to talk to, etc.
I now make and use Anki cards, and would encourage other people to consider doing so too
Other
Career planning: An in-depth process & template (80,000 Hours)
One key piece of career advice I want to highlight is that people often apply to too few things, don’t make enough ambitious applications, and/or don’t make enough “safe applications”
I think it’s generally good to apply to a lot of things, including both ambitious and safe options
(But of course, any piece of general career advice will have some exceptions)
I’d welcome comments suggesting other relevant links, or just sharing people’s own thoughts on any of the topics addressed above!
I definitely agree that one of the best things applicants interested in roles at organizations like ours can do to improve their odds of being a successful researcher is to read and write independent research for this forum and get feedback from the community.
I think another underrated way to acquire a credible and relevant credential is to become a top forecaster on Metaculus, Good Judgement Open, or Facebook’s Forecast app.
Thanks for sharing, Michael!
I think the Center for Reducing Suffering’s Open Research Questions may be a helpful addition to Research ideas. (Do let me know if you think otherwise!)
Relatedly, CRS has an internship opportunity.
Also, perhaps this is intentional, but “Readings and notes on how to do high-impact research” appears twice in the list.
Thanks for mentioning this—I’ve now added it to the “Programs [...]” section :)
This was intentional, but I think I no longer endorse that decision, so I’ve now removed the second mention.
I definitely think that that list is within scope for this document, but (or “and relatedly”) I’ve already got it in the Central directory for open research questions that’s linked to from here.
There are many relevant collections of research questions, and I’ve already included all the ones I’m aware of in that other post. So I think it doesn’t make sense to add any here unless I think the collection is especially worth highlighting to people interested in testing their fit for (longtermism-related) research.
I think the 80k collection fits that bill due to being curated, organised by discipline, and aimed at giving a representative sense of many different areas. I think my “Crucial questions” post fits that bill due to being aimed at overviewing the whole landscape of longtermism in a fairly comprehensive and structured way (though of course, there’s probably some bias in my assessment here!).
I think my history topics collection fits that bill, but I’m less sure. So I’ve now added below it the disclaimer “This is somewhat less noteworthy than the other links”.
I think my RSP doc doesn’t fit that bill, really, so in the process of writing this comment I’ve decided to move that out of this post and into my Central directory post.
(The fact that this post evolved out of notes I shared with people also helps explain why stuff I wrote has perhaps undue prominence here.)
Here’s one other section that was in the doc. I’m guessing this section will be less useful to the average person than the other sections, so I’ve “demoted” it to a comment.
Some quick thoughts regarding the value of posting on the Forum and/or conducting independent research, in my experience
Note that:
This section is lightly edited from what I wrote ~August 2020; I didn’t bother fully updating it with newer evidence and thoughts
This may of course not generalise to other people.
Some of this work was independent, some was associated with Convergence Analysis (which I worked for), and some was in between
Doing this definitely improved my thinking, my network, and how well-known I am among EAs
Not sure how much the third thing actually matters
Doing this seems to have accelerated my career trajectory via the above and via providing evidence of my abilities
See also this comment
I have some evidence of impact from my work
See my data and reflections here
Various people have cited my posts
Perhaps most often my database of existential risk (or similar) estimates
Sometimes those people were prominent EA researchers I respect
Mostly this was in other Forum posts or comments
But also at least twice in potentially important non-Forum reports
E.g., a Founders Pledge report
Often the citation seemed to suggest my work was a handy summary of an idea the person wanted to mention, rather than strongly suggesting that my posts advanced the person’s own thinking in a key way
But sometimes it seemed more like the latter
E.g., I was told that Research questions that could have a big social impact, organised by discipline was kind-of based on or made much easier by my Central directory of open research questions
I was told that an organisation I respect had a discussion about what their theory of change was and should be, prompted by a related post of mine
The network-building/signalling from this may have also helped me have impact in other ways
E.g., I was contacted to review the above-mentioned Founders Pledge report, and was told my comments helped improve the report
Some people might also find it useful to check out EA-related facebook groups, which there’s a directory for here: https://www.facebook.com/EffectiveGroups/
Thanks, Michael!
The list of summer research training programs seems helpful. There might be some newer ones that are worth adding too.
Yeah, thanks for pointing this out! SERI seems cool to me, and I’ve now added a link to that form :)
(I actually added the link right before you made your comment, I think, due to someone else highlighting it to me in a different context. But it was indeed absent from the initial version of the post.)
See also https://forum.effectivealtruism.org/posts/ZA5HNrc8AtWGub4fk/ea-internships-board-now-live