It’s not necessarily that the “EA” jobs are more poorly paid, just that the people who take these roles could realistically earn much more elsewhere.
One way to think about it is that the aim of EA is to benefit the beneficiaries—the poorest people in the world, animals, future beings.
We should choose strategies that help the beneficiaries the most, rather than strategies that help people who happen to be interested in EA (unless that also helps the beneficiaries, through things like not burning out).
It makes sense to me that we should ask those who have had the most privilege to give back the most: if you have more money, you should give more of it away. If you have a stronger safety net and access to influence, you should use more of that to help others rather than yourself.
I think that, with the salaries, most people could probably earn more in other sectors if they only cared about monetary gain rather than including impact in their career choice. If you’re coming from the charity/public-service sector the salaries may seem high; if you’re coming from a private-sector career they seem low.
Looking at the grants database for 2023, there seem to be only 24 projects listed, totalling ~$204k, which is less than 10% of the money said to have been granted in 2023.
Including the 2022 Q4-2 tag, there are now 54 projects with grants totalling $1,170,000 (although this does include some of the examples above). I don’t know how many of these grants are included in the total sum given in the original post.
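For anyone wanting to reproduce these tallies, here is a minimal Python sketch of the kind of counting and summing involved. The CSV file, column names, and tag values are assumptions for illustration, not the grants database’s actual schema or export format:

```python
import csv

def tally_grants(path, tags):
    """Count and sum grants whose 'tag' column matches one of `tags`.

    Assumes a hypothetical CSV export with a numeric 'amount' column (USD)
    and a 'tag' column -- the real grants database may differ.
    """
    count, total = 0, 0.0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["tag"] in tags:
                count += 1
                total += float(row["amount"])
    return count, total

# Compare the 2023-only tally against the tally including the 2022 Q4 tag
n_2023, total_2023 = tally_grants("grants.csv", {"2023"})
n_all, total_all = tally_grants("grants.csv", {"2023", "2022 Q4"})
print(f"2023 only: {n_2023} projects, ${total_2023:,.0f}")
print(f"With 2022 Q4: {n_all} projects, ${total_all:,.0f}")
```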
The ten largest grants (together ~$703k of the $1,170,000) were:
$126k – 12-month part-time salary for 2 organisers, plus equipment and other expenses, to expand EA community building in Hong Kong
$114k – 8-month programme helping ambitious graduates to launch EU policy careers focused on emerging tech
$86k – Further developing the fast-growing Dutch platform for effective giving until April 2023
$63k – Grant renewal for “A Happier World”: salary and funding to continue producing video content
$62k – 12 months’ salary for EA for Jews’ Managing Director
$57k – Yearly salary for weekly written summaries of the top EA and LW forum posts, and a human-narrated podcast for the former
$50k – 6 months’ salary and minor project expenses for career exploration, focused on biosecurity projects
$50k – 6 months of funding to scale a robo-advisor app for charitable giving
$50k – Growing the readership of a Substack on forecasting enough to fund it with reader donations while keeping content free
$45k – 1 year of 2.5 FTE salary split across 5 people to do community building work for EA Philippines + student chapters
I think this has been thought about a few times since EA started.
In 2015, Max Dalton wrote about medical research and said the following:
“GiveWell note that most funders of medical research more generally have large budgets, and claim that ‘It’s reasonable to ask how much value a new funder – even a relatively large one – can add in this context’. Whilst the field of tropical disease research is, as I argued above, more neglected, there are still a number of large foundations, and funding for several diseases is on the scale of hundreds of millions of dollars. Additionally, funding the development of a new drug may cost close to a billion dollars.
For these reasons, it is difficult to imagine a marginal dollar having any impact. However, as MacAskill argues at several points in Doing Good Better, this appears to only increase the riskiness of the donation, rather than reducing its expected impact.”
In 2018, Peter Wildeford and Marcus A. Davis wrote about the cost-effectiveness of vaccines and suggested that a malaria vaccine is competitive with other global health opportunities.
There are various posts about volunteering here.
I’ve linked some below that might be the most relevant.
Volunteering isn’t free
Effective Volunteering
What is a good answer for people new to EA that request advice on volunteering?
Why You Should Consider Skilled Volunteering
Also, the $70 billion on development assistance for health doesn’t include other funding that contributes to development:
$100b+ on non-health development
$500b+ in remittances
Harder to estimate, but over $1 trillion spent by LMICs on their own development and welfare
The Panorama episode only briefly mentioned EA. Peter Singer spoke for a couple of minutes, and EA was mainly framed as a charity that would be missing out on money. There seemed to be a lot more interest in the internal discussions within FTX, the crypto drama, the politicians, celebrities, etc.
Maybe Panorama is an outlier, but potentially EA is not that interesting to most people, or seems too complicated to explain if you only have an hour.
I’ve written a bit about this here and think that EA and AI safety would both be better off if they were more distinct.
As AI safety has grown over the last few years, there may have been missed growth opportunities from not having a larger, separate identity.
I spoke to someone at EAG London 2023 who hadn’t realised that AI safety would be discussed at EAG until, after they had done an AI safety fellowship, someone suggested they should go. There are probably many examples of people with an interest in emerging tech risks who would have got more involved earlier if they’d been presented with those options at the beginning.
In 2015, one survey found that 44% of the American public would consider AI an existential threat; by February 2023 it was 55%.
I wrote about this idea before FTX, and I think FTX is a minor influence compared to the increased interest in AI risk.
My original reasoning was that AI safety is a separate field but doesn’t really have much movement-building work being put into it outside of EA/longtermism/x-risk-framed activities.
Another reason why AI takes up a lot of EA space is that there aren’t many other places to discuss these topics. That is bad for the growth of AI safety, if it’s hidden behind donating 10% and going vegan, and bad for EA, if it gets overcrowded by something that should have its own institutions, events, etc.
If the definition of being more engaged includes going to EAG and being a member of a group, aren’t some of these results a bit circular?
EA isn’t a political party, but I still think it’s an issue if the aims of the keenest members diverge from the original aims of the movement, especially if the barrier to entry to become a member is quite low compared to being in an EA governance position. I would worry that the people who would bother to vote have much less understanding of the strategic situation than the people who are working full time.
Maybe we have had different experiences; I would say that the people who turn up to more events are usually more interested in the social side of EA. Also, there are a lot of people in the UK who want to have impact and have a high interest in EA but don’t come to events and wouldn’t want to pay to be a member (or even sign up as a member if it were free).
I think people can still hold organisations to account and follow the money, even if they aren’t members, and this already happens in EA, with lots of critiques of different organisations and individuals.
I think one large disadvantage of a membership association is that it will usually consist of the most interested people, or the people most interested in the social aspect of EA. This may not always correlate with the people who could have the most impact, and it creates a hard boundary between who is in and who is out.
I’d be worried about members voting for activities that benefit them the most rather than the ultimate beneficiaries (global poor, animals, future beings).
Olivia Fox Cabane has an alt protein industry landscape map.
A separate organisation just for CBGs (Community Building Grants) would have been useful too, rather than a lot of one- and two-person teams with constant turnover.
I thought about this briefly a few months ago and came up with these ideas.
CEA – incubate CBG groups as team members until they are registered as separate organisations with their own operations staff
CEA, but for professional EA network building (EA Consulting Network, High Impact Engineers, Hi-Med, etc.) – these networks are even more isolated than CBGs, which have some support from CEA
Rethink Priorities – one of the incubated orgs could do similar work to EV Ops (which is maybe what the special projects team is doing already, but it might be good to have something more separate from RP), or a cause-specific support org (animal advocacy, AI safety, biosecurity)
EV Ops – spin out 80k/GWWC to increase capacity for other smaller orgs
Open Phil – some of their programmes might work better with project managers rather than individuals getting grants (e.g. the Century Fellowship)
Also, looking at local groups, there is some coordination on the groups Slack and at some retreats, but there is still a lot of duplication and a high rate of turnover, which limits any sustained institutional knowledge.
I didn’t vote, but there has been discussion of issues in richer countries that received votes where the author pointed out how the issue fit into the context of effective altruism.
There have also been posts about mass media interventions, but those generally refer to stronger evidence for their effectiveness.
Thanks for diving into the data, David. I think a lot of this might hinge on the ‘highly engaged EAs’ metric and how useful it is for determining impact versus how much interest someone has in EA.
Are you also able to see if there are differences between different types of local groups (national/city/university/interest)?
I would go further and say that more people are interested in specific areas like AI safety and biosecurity than in the general framing of x-risks, especially senior professionals who have worked in AI/bio careers.
There is value in some people working on x-risk prioritisation, but that would be a much smaller group than the eventual sizes of the cause-specific fields.
You mention this in your counterarguments but I think that it should be emphasised more.
There are a lot of private-sector community roles, some with salaries up to $180k. Here are some examples from a community manager job board.