Projects I’d like to see

We’ve just launched the Effective Altruism Grants program to help people put promising ideas into practice. I’m hoping that the program will enable some people to transition onto higher-impact paths that they otherwise wouldn’t have been able to pursue.

Here I’m going to list some applications I’d personally like to see. The list of ideas isn’t close to exhaustive, and you’re not guaranteed funding if you apply with one of these ideas. And I’m not claiming that any particular version of these ideas is good. But they represent some projects I’m potentially excited about, depending on execution. For some of them, I’d be happy to provide mentorship in order to help them succeed. More potential ideas are listed on the Effective Altruism Grants webpage. Note that CEA might not be able to fund all of the following types of projects, but we may share promising proposals that we are unable to fund with our partners.


General types of applications I’d like to see

Further study

You need to pursue graduate study in order to move into an impactful line of work.

Exploring a career switch

You think you could do more good in a career other than the one you’re currently in, but you’re not sure what, exactly, is the best alternative. Funding for around three months might allow you to do internships, make applications, and get advice from people. You’d like to do this, but you can’t afford it.

Earning-to-give buy-out

You’re currently earning to give because you think your donations do more good than your direct work would. It might be that we think your direct work would be more valuable. If so, we could donate a proportion of the amount you were giving to the same places you were donating it, and you could move into direct work.

Buying research time

You’re a professor and could spend more time on impactful research if you were bought out of your teaching and administrative duties.

Unpaid internships

You have an opportunity to do an unpaid internship, but couldn’t otherwise afford it.

New organisation

You have an idea for a new non-profit or for-profit organisation, and need some startup funding to test it out.

Running a local group

You’re currently leading a local group, and would like to run it full-time.

More specific ideas that I’d like to see

EA Outreach and Community

I’d be excited to see people moving into part-time or full-time positions running local groups. For instance, perhaps someone is a successful local group leader while a student, and feels they could continue that work full-time after they graduate.

I’d be excited to see applications from countries where we don’t currently have a large presence. For instance, we don’t have much of a presence in China, even though it’s very likely to be one of the most important and influential countries of the 21st century. There are big challenges in adapting EA to resonate with Chinese culture, but I’d be particularly excited to see applications aimed at trying to figure out how to address those challenges.

With respect to local groups, I’d love to see group leaders trying out new activities and then writing up an assessment. If such experiments are successful, they could be rolled out to other local groups. (The Oxford Prioritisation Project is a recent example of this—a write-up of their project is coming soon.)

A few ideas I’d like to see tested are as follows:

Anti-Debates

Debating is a very common activity at universities, but the usual style of debating is antithetical to the EA approach to reasoning. The aim is to defend a particular point of view, rather than to figure out what the truth is. It’s combative rather than collaborative, and rhetoric tends to take precedence over evidence and logic.

Instead, we could run “anti-debates”, where two people publicly discuss a topic, stating their views at the outset. They get scored by a panel of judges on a set of criteria that we believe to be genuinely epistemically valuable, such as:

Quality and breadth of arguments given

Understanding of the opposite point of view (and avoidance of ‘straw man’)

Appropriate degree of belief given the level of evidence at hand

Willingness to change your mind in the face of contrary argument

Prediction tournaments

You lead a group which gets together to make forecasts, with real money on the line, in order to improve your forecasting ability. You might share your predictions with others, to help inform their decisions.

Dragon’s Den/Shark Tank-style career choice discussions

You lead a group which gets together every week. Each week, one of the members stands up in front of everyone and outlines their career plans, explaining why they’re choosing what they’re choosing, and why that’s the best way for them to do the most good. The others then debate with them whether they’re choosing the right path. A variant would be the ‘reciprocity ring’, where people offer the presenter any help they can (such as things to read, or introductions), or ‘peer coaching’ networks, where people mentor each other, talking through their career plans and offering advice.

Research working groups

A group of you could work on a shared research project, over the course of a semester. This could be on cause prioritisation, or on a specific topic of EA importance (e.g. going through the GiveWell charity cost-effectiveness models and criticising them or investigating what the best policy is within a certain area).

Specific skill-building

I worry that at the moment too many of the most dedicated community members are building general-purpose skills, such as by going into consulting, rather than getting skills in particular areas that are underrepresented within the effective altruism movement.

This could include graduate-level study in biology, machine learning, economics, or political science; taking up a fellowship at a think-tank; or going into government. For those with a quantitative PhD, it could involve applying for the Google Brain Residency program or the AI safety fellowship at ASI.

New organisations

I’d love to see people making a concerted effort to develop EA in new areas. One example would be a think-tank, where people would work out what policies look most promising from an EA perspective. (There are risks involved in this area—in particular of EA becoming partisan—so I think that at this stage the best approach would be research and investigation, rather than activism.) Another would be a GiveWell for impact investing, where you could search for the best impact investing opportunities from an EA perspective.

Writing

I’d be keen to see more long-form writing done on EA topics, whether for blogs, mainstream media, or books. In general, I’m much more interested in deep, substantial pieces of writing than in short think-pieces. Topics could include:

Cause prioritisation

CRISPR and eradicating malaria

What life is really like on $1.90 per day

Geoengineering

Pandemics from novel pathogens

Open borders

Wild animal suffering

Further analysis of common ways of doing good (e.g. recycling, fair trade, divestment, or campaigns) in terms of their effectiveness.

Often there’s already quite a lot of high-quality material on these topics, scattered across blogs and research articles. What’s needed is for someone to gather together those materials and write a single go-to introduction to the topic. (To a significant extent that’s what Doing Good Better was doing.)

I’d be keen to see more people take ideas that we think we already know, but haven’t ever been put down in writing, and write them up in a thorough and even-handed way; for example, why existential risk from anthropogenic causes is greater than the existential risk from natural causes, or why global health is a particularly promising area within global development.

For younger writers, one strategy could be to co-author a book with an established academic. The academic might have produced a body of research on an important topic, but not be very good at, or very interested in, writing clearly for a wider audience. In that case, you could suggest producing a co-authored book on their topic together.