Projects I’d like to see
We’ve just launched the Effective Altruism Grants program to help people put promising ideas into practice. I’m hoping that the program will enable some people to transition onto higher-impact paths that they otherwise wouldn’t have been able to pursue.
Here I’m going to list some applications I’d personally like to see. The list of ideas isn’t close to exhaustive, and you’re not guaranteed funding if you apply with one of these ideas. And I’m not claiming that any particular version of these ideas is good. But they represent some projects I’m potentially excited about, depending on execution. For some of them, I’d be happy to provide mentorship in order to help them succeed. More potential ideas are listed on the Effective Altruism Grants webpage. Note that CEA might not be able to fund all of the following types of projects, but we may share promising proposals that we are unable to fund with our partners.
General types of applications I’d like to see
Further study
You need to pursue graduate study in order to move into an impactful line of work.
Exploring a career switch
You think you could do more good in a career other than the one you’re currently in, but you’re not sure what, exactly, is the best alternative. Funding for around three months might allow you to do internships, make applications, and get advice from people. You’d like to do this, but you can’t afford it.
Earning-to-give buy-out
You’re currently earning to give, because you think that your donations are doing more good than your direct work would. It might be that we think it would be more valuable if you did direct work. If so, we could donate a proportion of the amount you were donating to wherever you were donating it, and you would move into direct work.
Buying research time
You’re a professor and could spend more time on impactful research if you were bought out of your teaching and administrative duties.
Unpaid internships
You have an opportunity to do an unpaid internship, but couldn’t otherwise afford it.
New organisation
You have an idea for a new non-profit or for-profit organisation, and need some startup funding to test it out.
Running a local group
You’re currently leading a local group, and would like to run it full-time.
More specific ideas that I’d like to see
EA Outreach and Community
I’d be excited to see people moving into part-time or full-time positions running local groups. For instance, perhaps someone is a successful local group leader while a student, and feels they could continue that work full-time after they graduate.
I’d be excited to see applications from countries where we don’t currently have a large presence. For instance, we don’t have much of a presence in China, even though it’s very likely that it will be one of the most important and influential countries over the 21st century. There are big challenges to adapting EA to resonate with Chinese culture, but I’d be particularly excited to see applications aimed at trying to figure out how to address those challenges.
With respect to local groups, I’d love to see group leaders trying out new activities and then writing up an assessment. If such experiments are successful, they could be rolled out to other local groups. (The Oxford Prioritisation Project is a recent example of this—a write-up of their project is coming soon.)
A few ideas I’d like to see tested are as follows:
Anti-Debates
Debating is a very common activity at universities, but the usual style of debating is antithetical to the EA approach to reasoning. The aim is to defend a particular point of view, rather than to figure out what the truth is. It’s combative rather than collaborative, and rhetoric tends to take precedence over evidence and logic.
Instead, we could run “anti-debates”, where two people publicly discuss a topic, stating their views at the outset. They get scored by a panel of judges on a set of criteria that we believe to be genuinely epistemically valuable, such as:
Quality and breadth of arguments given
Understanding of the opposite point of view (and avoidance of ‘straw man’)
Appropriate degree of belief given the level of evidence at hand
Willingness to change your mind in the face of contrary arguments
Prediction tournaments
You lead a group which gets together to make forecasts, with real money on the line, in order to improve your forecasting ability. You might share your predictions with others, to help inform their decisions.
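A standard way to score forecasts in a tournament like this is the Brier score (mean squared error between stated probabilities and outcomes, lower is better). A minimal sketch — the members, questions, and probabilities here are made up for illustration:

```python
# Score a round of a prediction tournament with the Brier score.
# Forecasts and outcomes below are hypothetical examples.

def brier_score(forecasts):
    """Mean squared error between stated probabilities and outcomes.

    forecasts: list of (probability, outcome) pairs, where outcome is
    1 if the event happened and 0 if it didn't. Lower scores are better.
    """
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# Two members' forecasts on the same three resolved questions.
alice = [(0.9, 1), (0.2, 0), (0.6, 1)]  # confident and mostly right
bob = [(0.5, 1), (0.5, 0), (0.5, 1)]    # hedged everything at 50%

print(round(brier_score(alice), 3))  # prints 0.07
print(round(brier_score(bob), 3))    # prints 0.25
```

Tracking this score over time (and comparing it against a always-50% baseline like Bob's) is one simple way for a group to tell whether its forecasting ability is actually improving.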
Dragon’s Den/Shark Tank-style career choice discussions
You lead a group which gets together every week. Each week, one member stands up in front of everyone and outlines their career plans, explaining why they’re choosing what they’re choosing, and why that’s the best way for them to do the most good. The others then debate with them whether they’re choosing the right path. A variant would be the ‘reciprocity ring’, where people offer any help they can (such as things to read, or introductions), or ‘peer coaching’ networks, where people mentor each other, talking through career plans and offering advice.
Research working groups
A group of you could work on a shared research project, over the course of a semester. This could be on cause prioritisation, or on a specific topic of EA importance (e.g. going through the GiveWell charity cost-effectiveness models and criticising them or investigating what the best policy is within a certain area).
Specific skill-building
I worry that at the moment too many of the most dedicated community members are building general-purpose skills, such as by going into consulting, rather than getting skills in particular areas that are underrepresented within the effective altruism movement.
This could include graduate-level study in biology, machine learning, economics, or political science; taking up a fellowship at a think-tank; or going into government. For those with a quantitative PhD, it could involve applying for the Google Brain Residency program or the AI safety fellowship at ASI.
New organisations
I’d love to see people making a concerted effort to develop EA in new areas. One example would be a think-tank, where people would work out what policies look most promising from an EA perspective. (There are risks involved in this area—in particular of EA becoming partisan—so I think that at this stage the best approach would be research and investigation, rather than activism.) Another would be a GiveWell for impact investing, where you could search for the best impact investing opportunities from an EA perspective.
Writing
I’d be keen to see more long-form writing on EA topics, whether for blogs, mainstream media, or books. In general, I’m much more interested in deep, substantial pieces of writing than in short think-pieces. Topics could include:
Cause prioritisation
CRISPR and eradicating malaria
What life is really like on $1.90 per day
Geoengineering
Pandemics from novel pathogens
Open borders
Wild animal suffering
Further analysis of common ways of doing good (e.g. recycling, fair trade, divestment, or campaigns) in terms of their effectiveness.
Often there’s already quite a lot of high-quality material on these topics, scattered across blogs and research articles. What’s needed is for someone to gather together those materials and write a single go-to introduction to the topic. (To a significant extent that’s what Doing Good Better was doing.)
I’d be keen to see more people take ideas that we think we already know, but haven’t ever been put down in writing, and write them up in a thorough and even-handed way; for example, why existential risk from anthropogenic causes is greater than the existential risk from natural causes, or why global health is a particularly promising area within global development.
For younger writers, one strategy could be to co-author a book with an established academic. The academic might have produced a body of research on an important topic, but not be very good at, or very interested in, writing clearly for a wider audience. In that case, you could suggest producing a co-authored book on their topic.
See also Concrete project lists and the additional suggestions in the comments there.
Entering China would be awesome. So many people with money and no one’s donating it. It ranks dead freaking last on the World Giving Index. Which in a way is a good thing… it means lots of room to grow!
China’s domestic charities are usually operated and funded by the government (basically part of the government). And starting this year, the government has basically taken control of foreign NGOs in China.
Often, rich Chinese elect to donate to foreign NGOs because they are more credible. Besides, being government-controlled, charities in China are not known for being reputable, prompting billionaire Jack Ma to famously quip “It’s harder to donate money to Chinese charities than to earn it.” The China Foundation Center was created a few years ago to promote transparency in the nonprofit sector.
India is also a good target. Like China, no one there trusts charities. Probably because they’re all scams? But there is an organization called Credibility Alliance that accredits the more transparent ones. I’m a big fan of Transparency International India. They do so much on a shoestring in the single most important issue in the country (corruption), and are the most credible/transparent.
I was a bit confused by some of these. Posting questions/comments here in case others have the same thoughts:
This made more sense to me after I realised that we should probably assume the person doesn’t think CEA is a top donation target. Otherwise they would have an empirical disagreement about whether they should be doing direct work, and it’s not clear how the offer helps resolve that (though it’s obviously worth discussing).
These are all things that might be good, but it’s not obvious how funding would be a bottleneck. Might be worth saying something about that?
Similarly I’m confused what the funding is meant to do in these cases.
I think you were using this as an example of the type of work, rather than a specific request, but some readers might not know that there’s a paper forthcoming on precisely this topic (if you mean something different from that paper, I’m interested to know what!).
Thanks Owen!
Re EtG buy-out—yes, you’re right. For people who think that CEA is a top donation target, hopefully we could just come to agreement, as a trade wouldn’t be possible, or would be prohibitively costly (if there were only slight differences in our views on which places were best to fund).
Re local group activities: These are just examples of some of the things I’d be excited about local groups doing, and I know that at least some local groups are funding constrained (e.g. someone is running them part-time, unpaid, and will otherwise need to get a job).
Re AI safety fellowship at ASI—as I understand it, that is currently funding constrained (they had great applicants who wanted to take the fellowship but ASI couldn’t fund it). For other applications (e.g. Google Brain) it could involve, say, spending some amount of time during or after a physics or math PhD in order to learn some machine learning and be more competitive.
Re anthropogenic existential risks—ah, I had thought that it was only in presentation form. In which case: that paper is exactly the sort of thing I’d love to see more of.
Re Anti-Debates/Shark Tank etc.: these might be things local groups would organise anyway, but they wouldn’t plan them out and evaluate them carefully unless they had more time to do so.
Can you address the unanswered question in the announcement thread regarding EA Ventures?
Additionally, is the money already raised for this? That was the major shortcoming with the previous iteration.
Yes, the money is raised; we have a pot of £500,000 in the first instance.
It is a successor to EA Ventures, though EA Grants already has funding, and is more focused on individuals than start-up projects.
I agree that growing EA in China will be important, given China’s increasing wealth, clout, confidence, and global influence. If EA fails to reach a critical mass in China, its global impact will be handicapped in 2 to 4 decades. But, as Austen Forrester mentioned in another comment, the charity sector may not be the best beachhead for a Chinese EA movement.
Some other options: First, I imagine China’s government would be motivated to think hard about X-risks, particularly in AI and bioweapons—and they’d have the decisiveness, centralized control, and resources to really make a difference. If they can build 20,000 miles of high-speed rail in just one decade, they could probably make substantial progress on any challenge that catches the Politburo’s attention. Also, they tend to take a much longer-term perspective than Western ‘democracies’, planning fairly far into the mid to late 21st century. And of course if they don’t take AI X-risk seriously, all other AI safety work elsewhere may prove futile.
Second, China is very concerned about ‘soft power’—global influence through its perceived magnanimity. This is likely to happen through government do-gooding rather than from private charitable donations. But gov’t do-gooding could be nudged into more utilitarian directions with some influence from EA insights—e.g. China eliminating tropical diseases in areas of Africa where it’s already a neocolonialist resource-extraction power, or reducing global poverty or improving governance in countries that could become thriving markets for its exports.
Third, lab meat & animal welfare: China’s government knows that a big source of subjective well-being for people, and a contributor to ‘social stability’, is meat consumption. They consume more than half of all pork globally, and have a ‘strategic pork reserve’ (https://www.cnbc.com/id/100795405). But they plan to reduce meat consumption by 50% for climate-change reasons (https://www.theguardian.com/world/2016/jun/20/chinas-meat-consumption-climate-change). This probably creates a concern for the gov’t: people love their pork, but if they’re told to simply stop eating it in the service of reducing global warming, they will be unhappy. The solution could be lab-grown meat. If China invested heavily in that technology, it could have all the climate-change benefits of reduced livestock farming, without people being resentful and unhappy about having to eat less meat. So that seems like a no-brainer for getting the Chinese gov’t interested in lab meat.
Fourth, with rising affluence, young Chinese middle-class people are likely to have the kind of moral/existential/meaning-of-life crises that hit the US baby boomers in the 1960s. They may be looking for something genuinely meaningful to do with their lives beyond workaholism & consumerism. I think 80k hours could prove very effective in filling this gap, if it developed materials suited to the Chinese cultural, economic, and educational context.
I didn’t mean to imply that it was hopeless to increase charitable giving in China, rather the opposite, that it’s so bad it can only go up! Besides that, I agree with all your points.
The Chinese government already provides foreign aid in Africa to make it possible to further their interests in the region. I was thinking of how we could possibly get them to expand it. The government seems almost impossible to influence, but perhaps EAs could influence African governments to try to solicit more foreign aid from China? It could have a negative consequence, however, in that receiving more aid from China may make Africa more susceptible to accepting bad trade deals, etc.
I don’t know how to engage with China, but I do strongly feel that it holds huge potential for both altruism and also GCRs, which shouldn’t be ignored. I like CEA’s approach of seeking China expertise from generalist experts. There are a number of existing Western–China think tanks that could be useful to the movement, but I think that a “China czar” for EA is a necessity.
I am interested in the Anti-Debates topic, specifically how the format would work. Is there a good place for me to follow up on that? I’d be willing to work with or even help administer any relevant groups or online forum. Thanks.
This advertisement for a Faculty Ethics Bowl on investment in the far future made me think of the anti-debate type concept. It’s not exactly that, but they say: “But this won’t be your ordinary run-of-the-mill debate. Ethics Bowl is very different from traditional debate formats. The teams are docked for using rhetoric, spin, aggression, and clever rationalization. Instead, each team is judged on the basis of active listening, flexibility, collaboration, and analytical rigor—essential ingredients for a meaningful discussion on difficult topics.”
See https://news.ucsc.edu/2019/05/faculty-ethics-bowl.html
Have you considered combining the “GiveWell for impact investing” idea with the Effective Altruism Funds idea and create an EA impact investing biz within your charity? You could hire staff to find the best impact investing opportunities and create a few funds for different risk tolerances. Theoretically, it could pay for itself (or make serious money for CEA if successful enough) with a modest management fee. I’m not sure if charities are allowed to grant to businesses, but I know they can operate their own businesses as long as it’s related to their mission.
FYI, the subject of unification versus diversity is one the EoST community debates with great frequency and vigour: bio mimicry may suggest that diversity is nature’s way of helping us survive …
However, for unity of purpose, some useful umbrellas are: Global Abundance; Education; Health; Eco Sustainability …
EoST?
I’m an EA London member, wondering how to contact William or just EA London on the subjects of new organisations and also anti debating …
Please do have a look at Ecology of Systems Thinking—EoST on Facebook—where I act as (unpaid) moderator. Much of the thinking William talks of is present, and several projects we are trying to initiate are emerging via a very early stage “network corporate” …
I recently listed thirty odd projects that could be useful, given we trial them to some extent, and I’m sure there are students who would be interested to explore the ideas a little further for interesting and hopefully relevant project work.
I’m also trying to use Teal principles wherever possible, and we seek to continue including members many reckon are on the autistic spectrum.
It’s not very effective, yet, but pretty altruistic. Needless to say we are working on ways to get more effective.
I agree on the writing being scattered. Task 1) is: get the writing on a given topic into a single place. That still leaves task 2) get all those collated writings into a single place.
On 2), it strikes me that it would be good if CEA compiled a list of EA-relevant resources. An alternative would be someone creating an edited collection of the best recent EA work on a range of topics. Or, if we have an academic EA Global, treating it like a normal academic conference and publishing the presented papers.
There’s a Google spreadsheet called the EA Database with hundreds of links, all ordered into categories and subcategories. Do you have access to this?