Notes on EA-related research, writing, testing fit, learning, and the Forum
UPDATE: I now consider my 2022 post “Interested in EA/longtermist research careers? Here are my top recommended resources” a better starting point than this older post, but this post might be useful after you’ve read that 2022 one.
Cross-posted to LessWrong.
I’ve had calls with >30 people who are interested in things like testing their fit for EA-aligned research careers, writing on the Forum, “getting up to speed” on areas of EA, etc. (This is usually during EA conferences.)
I gradually collected a set of links and notes that I felt that many such people would benefit from seeing, then turned that into a Google Doc. Many people told me they found that doc useful, so I’m now (a) sharing it as a public post, and (b) still entertaining the hypothesis that those people were all just vicious liars and sycophants, of course.
Disclaimers
Not all of these links/​notes will be relevant to any given person
These links/​notes are most relevant to people interested in (1) research roles, (2) roles at explicitly EA organisations, and/​or (3) longtermism
But this is just because that’s what I know best
There are of course many important roles that aren’t about research or aren’t at EA orgs!
And I’m happy with many EAs prioritising cause areas other than longtermism
But, in any case, some of the links/​notes will also be relevant to other people and pathways
This doc mentions some orgs I work for or have worked for previously, but the opinions expressed here are my own, and I wrote the post (and the doc it evolved from) in a personal capacity
Regarding writing, the Forum, etc.
Why you (yes, you) should post on the EA Forum (a talk by Aaron Gertler)
How you can make an impact on the EA Forum (another talk by Aaron Gertler)
Aaron also made a companion document which I think is useful even if you don’t watch the talk
Feedback available for EA Forum drafts (a post by that Aaron Gertler guy I’ve been hearing so much about lately)
Effective Altruism Editing and Review
Reasoning Transparency
This has great writing tips that definitely apply on the Forum and would ideally apply everywhere, though unfortunately they don’t perfectly align with the norms of some areas/fields
Readings and notes on how to write/​communicate well
Reasons for and against posting on the EA Forum
Regarding types of writing you can do:
Write about any of the research ideas in the links in the next section
Write summaries and/​or collections
Write book reviews (see Buck’s suggestion, my suggestion, and posts tagged books)
“deep dive into seminal papers/​blog posts and attempt to identify all the empirical and conceptual errors in past work, especially writings by either a) other respected EAs or b) other stuff that we otherwise think of as especially important.” (see Linch’s shortform)
I also encourage you to consider reading, commenting, and posting on LessWrong
See Welcome to LessWrong! for a great introduction to that site
Sometimes people worry that a post idea might be missing some obvious, core insight, or might just replicate some other writing they haven’t come across. I think this is a problem only inasmuch as it could’ve been more efficient for you to learn those things directly than to slowly craft a post.
So if you can write (a rough version of) the post quickly, you could just do that.
Or you could ask around or make a quick Question post to outline the basic idea and ask if anyone knows of relevant things you should read.
Research ideas
Research questions that could have a big social impact, organised by discipline
A central directory for open research questions
Crucial questions for longtermists
Some history topics it might be very valuable to investigate
This is somewhat less noteworthy than the other links
Programs, approaches, or tips for testing fit for (longtermism-related) research
Programs
Not all of these things are necessarily “open” right now.
Here are things I would describe as research training programs (in alphabetical order to avoid picking favourites):
ALLFED volunteering
Center for Reducing Suffering—“Get involved”
CLR Summer Research Fellowship
Fellowship – Legal Priorities Project
FHI Summer Research Fellowship
GCRI call for collaborators/​advisees
Governance of AI Fellowship—Future of Humanity Institute
GPI Early Career Conference Program & Forethought Global Priorities Fellowship
Nonlinear Fund
Rethink Priorities—Research Internship
SERI (Stanford Existential Risk Initiative) Summer 2021 Research Fellowship
I have a more thorough, less curated list here
Note: I know less about what the opportunities at the Center for Reducing Suffering and the Nonlinear Fund would be like than about the other opportunities, so I’m not necessarily able to personally endorse those two.
Other things
Effective Thesis
See in particular their Resources page
Consider doing a graduate degree?
Perhaps ideally with a supervisor who’s an excellent researcher themselves, and who is either working on high-priority topics or flexible enough to accommodate you working on high-priority topics
You could contact Effective Thesis for advice on this
Offer to be a research assistant for someone doing great work?
See Matthew Van Der Merwe’s thoughts on RA roles
See my Notes from a call with someone who’s an RA to a great researcher
These roles can sometimes be done remotely
Annotated List of Project Ideas & Volunteering Resources
Grants
LTFF
CLR
SAF
List of EA funding opportunities
Things I often tell people about applying to EA Funds
Undergraduate thesis prize — Forethought Foundation
Improving the EA-aligned research pipeline: Sequence introduction
This sequence is targeted more at people who could help improve the pipeline than at people aiming to enter and move through it, but various parts could still be useful to the latter group.
Readings and notes on how to do high-impact research
Readings and notes on how to do high-quality, efficient research
Collection of collections of resources relevant to (research) management, mentorship, training, etc.
Getting “up to speed” on EA, longtermism, x-risks, etc.
The Precipice
Crucial questions for longtermists
A ranked list of all EA-relevant (audio)books I’ve read
A list of EA-related podcasts
Where to find EA-related videos
What are some good online courses relevant to EA?
How I personally “got up to speed”
You can search the EA Forum tags page for relevant tags, read those pages (since they’re also Wiki entries), and use them as directories to useful posts, authors to talk to, etc.
I now make and use Anki cards, and would encourage other people to consider doing so too
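As an aside for programmer-types: if you want to generate Anki cards in bulk from notes you already have, one option is the genanki Python library. Here’s a minimal sketch of that approach; the IDs, deck name, and example card below are made up for illustration (the original post doesn’t mention any particular tooling):

```python
import genanki

# A model defines the fields and card templates. The ID is an arbitrary
# but stable random integer, so Anki can recognise the model across imports.
model = genanki.Model(
    1607392319,  # made-up stable ID
    "Simple Q&A",
    fields=[{"name": "Question"}, {"name": "Answer"}],
    templates=[{
        "name": "Card 1",
        "qfmt": "{{Question}}",
        "afmt": "{{FrontSide}}<hr id='answer'>{{Answer}}",
    }],
)

deck = genanki.Deck(2059400110, "EA reading notes")  # made-up stable ID
deck.add_note(genanki.Note(
    model=model,
    fields=["What does 'longtermism' claim?",
            "That positively influencing the long-term future is a key moral priority."],
))

# Write an .apkg file you can import into Anki.
genanki.Package(deck).write_to_file("ea_reading_notes.apkg")
```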
Other
Career planning: An in-depth process & template (80,000 Hours)
One key piece of career advice I want to highlight is that people often apply to too few things, don’t make enough ambitious applications, and/or don’t make enough “safe” applications
I think it’s generally good to apply to a lot of things, including both ambitious and safe options
(But of course, any piece of general career advice will have some exceptions)
Get Involved
Research Methods
EA Creatives and Communicators Slack
I’d welcome comments suggesting other relevant links, or just sharing people’s own thoughts on any of the topics addressed above!