Intervention options for improving the EA-aligned research pipeline
See the post introducing this sequence for context, caveats, credits, and links to prior discussion relevant to this sequence as a whole. This post doesn’t necessarily represent the views of my employers.
Summary
In a previous post, I highlighted some observations that I think collectively demonstrate that the current processes by which new EA-aligned research and researchers are “produced” are at least somewhat insufficient, inefficient, and prone to error. In this post, I’ll briefly discuss 19 interventions that might improve that situation. I discuss them in very roughly descending order of how important, tractable, and neglected I think each intervention is, solely from the perspective of improving the EA-aligned research pipeline.[1] The interventions are:
Creating, scaling, and/or improving[2] EA-aligned research orgs
Creating, scaling, and/or improving EA-aligned research training programs (e.g. certain types of internships or summer research fellowships)
Increasing grantmaking capacity and/or improving grantmaking processes
Scaling Effective Thesis, improving it, and/or creating new things sort-of like it
Increasing and/or improving EAs’ use of non-EA options for research training, credentials, etc.[3]
Increasing and/or improving research by non-EAs on high-priority topics
Creating a central, editable database to help people choose and do research projects
Using Elicit (an automated research assistant tool) or a similar tool
Forecasting the impact projects will have
Adding to and/or improving options for mentorship, feedback sources, etc. (including from peers)
Improving the vetting of (potential) researchers, and/or better “sharing” that vetting
Increasing and/or improving career advice and/or support with networking
Reducing the financial costs of testing fit and building knowledge & skills for EA-aligned research careers
Creating and/or improving relevant educational materials
Creating, improving, and/or scaling market-like mechanisms for altruism (e.g., impact certificates)
Increasing and/or improving the use of relevant online forums
Increasing the number of EA-aligned aspiring/junior researchers
Increasing the amount of funding available for EA-aligned research(ers)
Discovering, writing, and/or promoting positive case studies
Feel free to skip to sections that interest you; each section should make sense by itself.
Target audience
As with the rest of this sequence:
This post is primarily intended to inform people who are helping implement or fund interventions to improve the EA-aligned research pipeline, or who could potentially do so in future
But it may also help people who hope to themselves “enter” and “progress through” the EA-aligned research pipeline
(For illustration, I’ve added a comment below this post regarding how my own career, project, and donation decisions have been influenced by thinking about why and how the EA-aligned research pipeline should be improved.)
Caveats and clarifications
Versions of many of these interventions already exist or have already been proposed
There are various other ways to carve up the space of options, various complementary framings that can be useful, etc.[4]
Many of these interventions would also or primarily have benefits unrelated to improving the EA-aligned research pipeline
These interventions differ in their importance (in general or for improving the EA-aligned research pipeline specifically), neglectedness, and tractability
And I haven’t gathered systematic data on those things[5]
Specific versions of a given intervention, or specific combinations of those interventions, would also differ on those variables[6]
Some of these interventions—or some versions of them—might not actually be worthwhile or even net-positive
These interventions differ in which aspects of the EA-aligned research pipeline they’d (primarily) improve
To keep this post (relatively!) brief, I don’t fully explain or justify all the points I make, nor mention all the points that come to mind regarding each intervention
I’m happy to provide further thoughts in replies to comments
I’m sure I’ve failed to mention some promising intervention options, and I’d welcome comments that mention additional ideas (whether they’re the commenter’s own idea or something that has been proposed elsewhere)
The intervention options
Creating, scaling, and/or improving EA-aligned research orgs
It seems like EA-aligned research orgs should house a substantial fraction of EA-aligned researchers and handle a substantial fraction of vetting, training, etc. for aspiring/junior EA-aligned researchers[7]
Though not necessarily the majority; there is also a place for grantmakers, non-EA orgs, independent research, etc.
There would be more capacity for that if EA-aligned research orgs were more numerous, larger, and/or better
Some things that would help us move towards that situation include:
Org leadership teams consciously thinking about how they can scale gracefully yet relatively quickly, designing their systems and strategies around that, building strong operations teams, hiring with that in mind (e.g., looking for people who could in future manage other hires), and providing staff with opportunities to build their management skills
Individuals trying to build skills in or pursue roles related to management, mentorship, or perhaps operations, and perhaps considering founding new orgs
Funders could try to fund orgs which seem likely to scale gracefully yet relatively quickly (if given more funding), fund the creation of new orgs (especially those that could scale well), and engage in “active funding” to create more such funding opportunities (see also field building)
See also:
Benjamin Todd on what the effective altruism community most needs
This comment thread on creating and/or scaling EA-aligned orgs
Creating, scaling, and/or improving EA-aligned research training programs
See here for posts relevant to this topic, and here for a list of such programs
These programs include things like research internships, summer research fellowships, and some volunteering programs
Examples of efforts to create, scale, and/or improve such programs include:
The creation of SERI
The SERI team’s efforts to encourage and support the creation of programs similar to themselves
My creation of a Slack workspace for people who are (or are planning to be) involved in organising such programs, to exchange ideas, ask questions, share resources, etc.
For one attempt to assess the impact of such a program, see Review of FHI’s Summer Research Fellowship 2020
Increasing grantmaking capacity and/or improving grantmaking processes
By “grantmaking capacity”, I mean the collective capacity grantmakers and others have to evaluate and/or create[8] funding opportunities
I don’t mean available funding; I have a separate section below on increasing available funding
Relevant individuals include people who work as grantmakers, other people who give donation recommendations, and people who make decisions about where to donate their own money
Ways grantmaking capacity could be increased include hiring or training new grantmakers, increasing the time spent on grantmaking by people who currently do it part-time, distributing funding to other individuals for regranting, and creating or scaling charity evaluators
Increasing grantmaking capacity and/or improving grantmaking processes could improve the EA-aligned research pipeline by increasing the amount and efficiency of financial support for aspiring/junior researchers and/or for work on any of the other interventions discussed in this post
See also Benjamin Todd on what the effective altruism community most needs
Scaling Effective Thesis, improving it, and/or creating new things sort-of like it
I have a quite positive impression of Effective Thesis[9]
I tentatively think it’d be good for Effective Thesis to expand in some way, and/or for additional things sort-of like Effective Thesis to be created
But I haven’t really thought about this much yet, and so:
For all I know, it might be the case that Effective Thesis are already doing most of the most valuable and tractable things in this space
I’m not really sure what, specifically, scaling, improving, or creating new things sort-of like Effective Thesis should look like
If this involves new orgs/projects, they could try somewhat different strategies and approaches to those used by Effective Thesis, or specialise more for particular user groups or topic areas
And they could share resources and learnings with Effective Thesis, and vice versa
(My understanding is that this would be analogous to the current situation in the EA-aligned career advice space, where the relevant organisations include 80,000 Hours, Animal Advocacy Careers, and Probably Good)
Increasing and/or improving EAs’ use of non-EA options for research-relevant training, credentials, testing fit, etc.
The next post in this sequence will focus on this idea, so I won’t discuss it here
Increasing and/or improving research by non-EAs on high-priority topics
See also field building
On a somewhat abstract level, this could be done through things like:
Increasing awareness of and inclination towards these topics among non-EAs
Funding work on these topics
Funding the creation of non-EA orgs, institutes, etc. focused on these topics (e.g., CSET[10])
Making it (seem) easier to publish respectable papers on these topics
Running conferences or workshops on these topics
Increasing interactions between EA and non-EA researchers
Providing guidance to non-EA researchers on these topics
Shifting academic norms and incentives towards choosing research for its impact potential
More concretely, this could be done through things like:
Organising workshops on the topic
Publishing papers on a high-priority topic (which could raise the topic’s salience, make publishing on it seem more acceptable, give people things to cite)
Inviting non-EAs to visit EA research institutes/orgs
Providing the kind of resources and coaching Effective Thesis provides
Scoping EA-aligned research directions in a way that makes them easier for people working in traditional academia to learn about, see the relevance of, connect to established disciplines, and work on
The GovAI and GPI research agendas could be seen as two examples of this sort of effort
Creating prizes or awards for the best research on a topic, and trying to make the prize/award sufficiently large, prestigious, and well-advertised in relevant places that top or promising non-EA researchers are drawn towards it
In addition to improving the pipeline for EA-aligned research produced by non-EAs, this might also improve the pipeline for EA-aligned researchers, such as by:
Causing longer-term shifts in the views of some of the non-EAs reached
Making it easier for EAs to use non-EA options for research training, credentials, etc. (see my next post)
And these benefits could perhaps be huge, as the vast majority of all research talent, funding, hours, etc. are outside of EA
On the other hand, it may be less tractable for “us” to increase and/or improve that pool of talent, funding, hours, etc., compared to doing so for the EA pool
Creating a central, editable database to help people choose and do research projects
The sixth post in this sequence will focus on this idea
Using Elicit (an automated research assistant tool) or a similar tool
The sixth post in this sequence will discuss this idea
Forecasting the impact projects will have
The sixth post in this sequence will discuss this idea
Adding to and/or improving options for collaborations, mentorship, feedback, etc. (including from peers)
This could include things like:
Encouraging and facilitating aspiring/junior researchers in connecting with each other to get feedback on plans, get feedback on drafts, collaborate, start coworking teams, and run focused practice sessions[11]
E.g., creating spaces like Effective Altruism Editing and Review
E.g., circulating advice and links like those contained in Notes on EA-related research, writing, testing fit, learning, and the Forum
E.g., perhaps, creating platforms like Impact CoLabs
(Those are just the first three examples that came to mind; there are probably other, quite different ways to achieve this goal)
Encouraging and facilitating aspiring/junior researchers and more experienced researchers to connect in similar ways
This could involve the aspiring/junior researcher acting as a research assistant
(Two out of two people I’ve spoken to about their experience as an RA to a more senior EA researcher thought those roles were great for their learning: 1, 2)
This could involve the more experienced researchers delegating some research tasks/projects that they wanted done anyway
This could help align the incentives of the more and less experienced researchers, including incentivising high-quality feedback
This could be paid or unpaid (i.e., volunteering)
One example of a project that arguably serves this purpose is READI
Creating, promoting, and/or engaging with resources on how to more efficiently and effectively seek or provide mentorship, feedback, etc.
E.g., writing posts like Giving and receiving feedback and Asking for advice
E.g., participating in a (non-EA) course on mentorship, coaching, or management, in order to then be better at providing those services to aspiring/junior researchers
Improving the vetting of (potential) researchers, and/or better “sharing” that vetting
For example:
Improving selection processes at EA-aligned research organisations
Increasing the number and usefulness of referrals of candidates from one selection process (e.g., for a job or a grant) to another selection process.
This already happens, but could perhaps be improved by:
Increasing how often it happens
Increasing how well-targeted the referrals are
Increasing the amount of information provided to the second selection process?
Increasing how much of the second selection process the candidate can “skip”?
Creating something like a “Triplebyte for EA researchers”, which could scalably evaluate aspiring/junior researchers, identify talented/promising ones, and then recommend them to hirers/grantmakers[12]
This could resolve most of the vetting constraints if it could operate efficiently and was trusted by the relevant hirers/grantmakers
Increasing and/or improving career advice and/or support with network-building
Examples of existing efforts along these lines include:
80,000 Hours
Animal Advocacy Careers
Probably Good
Many local EA groups
Parts of what the Improving Institutional Decision-Making working group and the Simon Institute for Longterm Governance do
In particular, my understanding is that these groups help provide some career advice and connections in their particular areas of expertise
Reducing the financial costs of testing fit and building knowledge & skills for EA-aligned research careers
For example, CEEALAR (formerly the EA Hotel) provides free or cheap accommodation and board to people engaging in these sorts of activities
One could set up similar things in other locations, or find other ways to reduce the financial costs of taking time to engage in these activities
Note that here I don’t mean providing funding to support people in doing these activities
Creating and/or improving relevant educational materials
Such materials could include courses, workshops, textbooks, standalone writings that are shorter than textbooks (e.g., posts), or sequences of such shorter writings
Topics these materials could focus on include on doing research in general, aspects of doing EA-aligned research that differ from research in other contexts, EA-aligned research using particular disciplines or methodologies, or research on particular EA-relevant topics
These materials could be created by EAs, adapted by EAs from existing things, or commissioned by EAs but created by other people
(Of course, non-EAs left to their own devices also make many relevant materials; on how that could be used, see “Increasing and/or improving EAs’ use of non-EA options for research training, credentials, etc.”)
Creating, improving, and/or scaling market-like mechanisms for altruism (e.g., impact certificates)
This could potentially have benefits such as improving prioritisation and providing a more efficient and scalable system of vetting research projects for funding
Increasing and/or improving the use of relevant online forums
I think many aspiring/junior researchers would benefit from using the EA Forum and/or LessWrong to:
learn about important ideas
discover or think of research questions
find motivation and a sense of accountability for doing research and writing (since they’re doing it for an actual audience)
I also think it would be possible and valuable to increase how often these sites are used—and how useful they are—for those purposes
It seems to me that impressive increases have already occurred since late 2018 (when I first started looking at the Forum)
Increasing the usefulness of these sites could include things like adding new features or integrating these sites with other interventions for improving the EA-aligned research pipeline (e.g., the database idea discussed in my next post)
(But here I should again note that, as with all interventions mentioned in this post, this wouldn’t address all the current imperfections in the EA-aligned research pipeline, nor render all the other interventions unnecessary)
Increasing the number of EA-aligned aspiring/junior researchers
The number of people “entering” or “in” the pipeline doesn’t seem to be as important a bottleneck as some other things (e.g., people with the specific skills necessary for specific projects, capacity to train more such people, capacity to put those people to good use, and capacity to vet people/projects; see Todd, 2020)
But more people in the pipeline would still likely lead to:
more people eventually becoming useful EA-aligned researchers
On the other hand, this comes at the opportunity cost of whatever else these people would’ve spent their time on otherwise
Additionally, more people in the pipeline might have negative consequences other than opportunity cost, such as:
People being turned off EA more generally because of frustration over repeatedly being declined jobs or funding (whereas those people may have found more success in other paths)
Making various forms of coordination, cooperation, and trust harder or less valuable
Leading to more low-quality or incautious work or messaging, reducing the credibility of EA-aligned research communities
Increasing the amount of funding available for EA-aligned research(ers)
As with the number of EA-aligned aspiring/junior researchers, funding for EA-aligned research(ers) doesn’t seem to be as important a bottleneck as some other things, but more funding would still help
Here I’m talking about increasing the funding available for activities whose primary goal is relatively directly leading to directly valuable research
In contrast, increasing the funding available for activities whose primary goal is improving the EA-aligned research pipeline—e.g., by supporting one of the interventions in this post—may better target the key bottlenecks and thus be more valuable
(Of course, many activities may have both types of goals, and sometimes with roughly equal weight)
I’m also only talking about the amount of funding available, not about how much high-priority research actually gets funded, since the latter also depends on other things such as grantmaking capacity and what projects/people are available to be funded
Discovering, writing, and/or promoting positive case studies
If you have thoughts on these interventions or other interventions to achieve a similar goal, or would be interested in supporting such interventions with your time or money, please comment below, send me a message, or fill in this anonymous form. This could perhaps inform my future efforts, allow me to connect you with other people you could collaborate with or fund, etc.
Though it’s hard to even say what that means, let alone how much anyone should trust my quick rankings; see also the “Caveats and clarifications” section.
I’m using the term “EAs” as shorthand for “People who identify or interact a lot with the EA community”; this would include some people who don’t self-identify as “an EA”.
For example, one could view each of these intervention options through the lens of creating and/or improving “hierarchical network structures” (see What to do with people?).
But I think it would be possible and valuable to do so. E.g., one could find many examples of people who were hired as a researcher at an EA-aligned org, went through an EA-aligned research training program, or did a PhD under a non-EA supervisor; look at what they’ve done since then; and try to compare that to some reasonable guesses about the counterfactual and/or people who seemed similar but didn’t have those experiences. (I know of at least one attempt to do roughly this.) It would of course be hard to be confident about causation and generalisability, but I think we’d still learn more than we know now.
For example, creating, scaling, and/or improving EA-aligned research organisations and doing the same for EA-aligned research training programs might be complementary goods; more of the former means more permanent openings for the “graduates” of those programs, and more of the latter means more skilled, motivated, and vetted candidates for those orgs.
For convenience, I’ll sometimes lump various different types of people together under the label “aspiring/junior researchers”. I say more about this group of people in a previous post of this sequence.
This is based on reading some of what they’ve written about their activities, strategy, and impact assessment; talking to people involved in the project; and my more general thinking about what the EA-aligned research pipeline needs. But I haven’t been an Effective Thesis coach or mentee myself, nor have I tried to carefully evaluate their impact.
That also seems valuable, but is covered in other sections of this post
Creating and/or improving relevant educational materials[13]
Such materials could include courses, workshops, textbooks, standalone writings that are shorter than textbooks (e.g., posts), or sequences of such shorter writings
Existing examples include Charity Entrepreneurship’s writings about their research process, parts of Charity Entrepreneurship’s handbook, posts tagged Research methods, and posts tagged Scholarship & Learning
Topics these materials could focus on include doing research in general, aspects of doing EA-aligned research that differ from research in other contexts, EA-aligned research using particular disciplines or methodologies, or research on particular EA-relevant topics
These materials could be created by EAs, adapted by EAs from existing things, or commissioned by EAs but created by other people
(Of course, non-EAs left to their own devices also make many relevant materials; on how those could be used, see “Increasing and/or improving EAs’ use of non-EA options for research training, credentials, etc.”)
Creating, improving, and/or scaling market-like mechanisms for altruism
See Markets for altruism and Certificates of impact
This could potentially have benefits such as improving prioritisation and providing a more efficient and scalable system of vetting research projects for funding
Increasing and/or improving the use of relevant online forums
I think many aspiring/junior researchers would benefit from using the EA Forum and/or LessWrong to:
learn about important ideas
discover or think of research questions
find motivation and a sense of accountability for doing research and writing (since they’re doing it for an actual audience)
disseminate their findings/ideas
get feedback
find collaborators
form connections
etc.
See also Reasons for and against posting on the EA Forum
I also think it would be possible and valuable to increase how often these sites are used—and how useful they are—for those purposes
It seems to me that impressive increases have already occurred since late 2018 (when I first started looking at the Forum)
Increasing the usefulness of these sites could include things like adding new features or integrating these sites with other interventions for improving the EA-aligned research pipeline (e.g., the database idea discussed in my next post)
(But here I should again note that, as with all interventions mentioned in this post, this wouldn’t address all the current imperfections in the EA-aligned research pipeline, nor render all the other interventions unnecessary)
My post Notes on EA-related research, writing, testing fit, learning, and the Forum is an example of an effort to increase and improve the use of relevant online forums, and also links to other examples of such an effort
Increasing the number of EA-aligned aspiring/junior researchers
The number of people “entering” or “in” the pipeline doesn’t seem to be as important a bottleneck as some other things (e.g., people with the specific skills necessary for specific projects, capacity to train more such people, capacity to put those people to good use, and capacity to vet people/projects; see Todd, 2020)
But more people in the pipeline would still likely lead to:
more people eventually becoming useful EA-aligned researchers
more fitting people being selected for the EA-aligned research roles/funding that would’ve been available anyway (since there’s a larger pool of people to select from; see also How replaceable are the top candidates in large hiring rounds?)
On the other hand, this comes at the opportunity cost of whatever these people would otherwise have spent their time on
Additionally, more people in the pipeline might have negative consequences other than opportunity cost, such as:
People being turned off EA more generally because of frustration over repeatedly being declined jobs or funding (whereas those people may have found more success in other paths)
Making various forms of coordination, cooperation, and trust harder or less valuable
Leading to more low-quality or incautious work or messaging, reducing the credibility of EA-aligned research communities
(See also value of movement growth)
Increasing the amount of funding available for EA-aligned research(ers)
As with the number of EA-aligned aspiring/junior researchers, funding for EA-aligned research(ers) doesn’t seem to be as important a bottleneck as some other things, but more funding would still help
Here I’m talking about increasing the funding available for activities whose primary goal is to relatively directly produce valuable research
In contrast, increasing the funding available for activities whose primary goal is improving the EA-aligned research pipeline—e.g., by supporting one of the interventions in this post—may better target the key bottlenecks and thus be more valuable
(Of course, many activities may have both types of goals, and sometimes with roughly equal weight)
I’m also only talking about the amount of funding available, not about how much high-priority research actually gets funded, since the latter also depends on other things such as grantmaking capacity and what projects/people are available to be funded
Discovering, writing, and/or promoting positive case studies
Discussed in a comment below this post
If you have thoughts on these interventions or other interventions to achieve a similar goal, or would be interested in supporting such interventions with your time or money, please comment below, send me a message, or fill in this anonymous form. This could perhaps inform my future efforts, allow me to connect you with other people you could collaborate with or fund, etc.
Though it’s hard to even say what that means, let alone how much anyone should trust my quick rankings; see also the “Caveats and clarifications” section.
Note that even good things can be made better!
I’m using the term “EAs” as shorthand for “People who identify or interact a lot with the EA community”; this would include some people who don’t self-identify as “an EA”.
For example, one could view each of these intervention options through the lens of creating and/or improving “hierarchical network structures” (see What to do with people?).
But I think it would be possible and valuable to do so. E.g., one could find many examples of people who were hired as a researcher at an EA-aligned org, went through an EA-aligned research training program, or did a PhD under a non-EA supervisor; look at what they’ve done since then; and try to compare that to some reasonable guesses about the counterfactual and/or people who seemed similar but didn’t have those experiences. (I know of at least one attempt to do roughly this.) It would of course be hard to be confident about causation and generalisability, but I think we’d still learn more than we know now.
For example, creating, scaling, and/or improving EA-aligned research organisations and doing the same for EA-aligned research training programs might be complementary goods; more of the former means more permanent openings for the “graduates” of those programs, and more of the latter means more skilled, motivated, and vetted candidates for those orgs.
For convenience, I’ll sometimes lump various different types of people together under the label “aspiring/junior researchers”. I say more about this group of people in a previous post of this sequence.
See “active funding”. See also field building.
This is based on reading some of what they’ve written about their activities, strategy, and impact assessment; talking to people involved in the project; and my more general thinking about what the EA-aligned research pipeline needs. But I haven’t been an Effective Thesis coach or mentee myself, nor have I tried to carefully evaluate their impact.
The original Director of CSET and several of its staff have been involved in the EA community, but many other members of staff are not involved in EA.
See, for example, Learnings about literature review strategy from research practice sessions.
This idea was suggested as a possibility by Peter Hurford. See some thoughts on the idea here.
I’m grateful to Edo Arad for suggesting I include roughly this intervention idea.