Community Organiser for EA UK
Organiser for EA Finance
There is a post about this (although it was written in 2015).
There are some good reasons why large donors might not want to give too much money to a charity at once:
Avoiding excessive reserves: Because of the opportunity costs (other charities could put the money to productive use sooner), it is undesirable for a charity to hold excessive reserves. Ideally, charities would be promised a steady stream of funding, conditional on meeting specific targets over many years, so that they can plan ahead.
Risk diversification: Funds should be distributed across several high-impact organisations in order to diversify the risk of any one of them underperforming.
Incentivizing others to join the cause area:
Countries: By restricting funding to a particular country, one incentivizes the country to invest in very effective health interventions themselves and use their (often very limited) domestic resources to close the funding gap between donations and the full cost of delivering effective health interventions. Poorer, low-income countries (such as Ethiopia) are less able to do this than low-to-middle income countries (such as India).
Charities: Restricting funding keeps charities on their toes, so that they do not rely exclusively on a particular foundation or big grant-giver and instead apply for other grants. For instance, the Gates Foundation heavily funded the Schistosomiasis Control Initiative in the past. Gates later discontinued SCI’s funding not because it was insufficiently effective, but because, once its effectiveness had been established, other funders would more readily fund it.
Other donors: By restricting funding to particular charities, other donors are incentivized to invest in effective charities too. For instance, the Against Malaria Foundation has a broader appeal to small private donors than higher-expected-value but riskier interventions. The Gates Foundation, the largest private foundation in the world with an endowment of US$42.9 billion[4], could in theory buy every person in Africa a bednet every two years (population of Africa (~1 billion) × cost of a bednet ($5) = $5 billion), but doing so would rapidly deplete its limited resources, leaving nothing for other very effective causes. It might reason that smaller, more risk-averse donors (who want to be certain that their money will have an impact) will close the funding gap for very effective, established interventions, so that it can instead spend more on riskier, high-expected-value areas.
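The back-of-the-envelope figures above can be checked with a quick sketch (all numbers are the rounded figures quoted in the text: a ~1 billion population, $5 per bednet, and the US$42.9 billion endowment):

```python
# Rough arithmetic from the comment above (all figures approximate).
population_africa = 1_000_000_000   # ~1 billion people
cost_per_bednet = 5                 # US dollars per net
endowment = 42_900_000_000          # Gates Foundation endowment, US$42.9B

# One net per person, replaced every two years.
cost_per_cycle = population_africa * cost_per_bednet
print(f"Cost per two-year cycle: ${cost_per_cycle / 1e9:.0f}B")

# How many two-year cycles could the endowment fund (ignoring investment returns)?
cycles = endowment / cost_per_cycle
print(f"Cycles fundable: {cycles:.2f} (~{2 * cycles:.0f} years)")
```

So the endowment could cover fewer than nine two-year cycles before running dry, which is the point being made: even the largest private foundation could not sustain this indefinitely.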
Technological Innovation: New technological innovations—such as a very effective malaria vaccine—might be discovered, and these might be more cost-effective.
High-risk, high-reward projects:
CGD has a different take on this type of migration.
“Between the start of 2021 and 2022, the number of Nigerian-born nurses joining the UK nursing register more than quadrupled, an increase of 2,325. Becker’s human capital theory would suggest that this increase in the potential wages earned by Nigerian-trained nurses should lead to an increase in Nigerians choosing to train as nurses. So what happened? Between late 2021 and 2022, the number of successful national nursing exam candidates increased by 2,982—that is, more than enough to replace those who had left for the UK.”
“To fully realise these benefits, Nigeria would need to embrace emigration, realising that nurses are likely going to leave anyway and doing everything they can to reap the benefits. Yet, they appear to be doing the opposite. New guidelines announced on 7 February 2024 state that nurses must work for two and a half years before being allowed to work overseas, a move nurses contest. This policy is far from optimal; restrictions on emigration are inefficient, inequitable, and unethical. Indeed, Ghana had a similar scheme, but ended up scrapping it because they were unable to employ all of their nurse trainees at home.”
I remember the ‘subforums’ being more like chat rooms in their design than actual subforums that you could navigate to from a front page.
It doesn’t seem that great an opportunity, as they’ve randomly selected 10,000 people out of 7.5 million adults. It then looks like you have to reach a consensus with the 50 participants, otherwise the money goes back to her.
I found the Global Skills Partnerships from CGD interesting, but I don’t know how active it still is, or whether you can fund it specifically.
As far as I know they weren’t funded by donated money; they received a grant from the S&F Fund and a smaller one from Open Phil (I don’t think either org takes donations). The rest was self-funded; more details in the original post.
I think it depends on how you define ‘narrow EA’. Focusing on getting 1% of the population to give effectively is different from helping 100 people make impactful career switches, but both could be defined as narrow in different ways.
One is narrow because it focuses on a small number of people; the other is narrow because it spreads only a subset of EA ideas.
Taking the Dutch Existential Risk Initiative example: it will be narrow in terms of cause focus, but the strategy could still vary between focusing on top academics and running a mass media campaign.
‘Narrow EA’ and having >1% of the population fitting the above description aren’t opposite strategies.
Maybe it’s similar to someone interested in animal welfare thinking alt-protein coordination should focus on scientists, entrepreneurs, funders, and policymakers, while also thinking it would be good for there to be lots of people interested in veganism.
There are a lot of private sector community roles, some with salaries up to $180k. Here are some examples from a community manager job board.
It’s not necessarily that the “EA” jobs are more poorly paid, just that the people who take these roles could realistically earn much more elsewhere.
One way to think about it is that the aim of EA is to benefit the beneficiaries—the poorest people in the world, animals, future beings.
We should choose strategies that help the beneficiaries the most, rather than strategies that help people who happen to be interested in EA (unless that also helps the beneficiaries—things like not burning out).
It makes sense to me that we should ask those who have had the most privilege to give back the most: if you have more money, you should give more of it away. If you have a stronger safety net and access to influence, you should use more of that to help others rather than yourself.
On the salaries: most people could probably earn more in other sectors if they only cared about monetary gain rather than including impact in their career choice. Coming from a charity or public-service career, the salaries may seem high; coming from a private-sector career, they seem low.
Looking at the grants database for 2023, there seem to be only 24 projects listed, for a total of ~$204k, which is less than 10% of the money said to have been granted in 2023.
Including the 2022 Q4-2 tag, there are now 54 projects with grants totalling $1,170,000 (although this does include some of the examples above). I don’t know how many of these grants are included in the total sum given in the original post.
The ten largest grants were:
$126k – 12-month part-time salary for 2 organisers, equipment and other expenses, to expand EA community building in Hong Kong
$114k – 8-month programme helping ambitious graduates to launch EU policy careers focused on emerging tech
$86k – Further develop the fast-growing Dutch platform for effective giving until April 2023
$63k – Grant renewal of “A Happier World”: salary and funding to continue producing video content
$62k – 12-month salary for EA for Jews’ Managing Director
$57k – Yearly salary for weekly written summaries of the top EA and LW forum posts, and a human-narrated podcast for the former
$50k – 6-month salary and minor project expenses for career exploration, focused on biosecurity projects
$50k – 6 months of funding to scale our robo-advisor app for charitable giving
$50k – To grow the readership of a Substack on forecasting enough to fund it with reader donations while keeping content free
$45k – 1 year of 2.5 FTE salary split across 5 people to do community building work for EA Philippines + student chapters
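As a quick sanity check on the list above, the ten grant amounts (as listed, in thousands of dollars) can be summed and compared against the ~$1,170k total:

```python
# Ten largest grants from the list above, in thousands of US dollars.
grants_k = [126, 114, 86, 63, 62, 57, 50, 50, 50, 45]

total_k = sum(grants_k)   # total of the top ten grants
share = total_k / 1170    # share of the $1,170,000 granted across all 54 projects
print(f"Top ten total: ${total_k}k (~{share:.0%} of $1,170k)")
```

So the ten largest grants account for roughly 60% of the total, with the remaining 44 projects sharing the rest.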
I think this has been thought about a few times since EA started.
In 2015 Max Dalton wrote about medical research and said the following:
“GiveWell note that most funders of medical research more generally have large budgets, and claim that ‘It’s reasonable to ask how much value a new funder – even a relatively large one – can add in this context’. Whilst the field of tropical disease research is, as I argued above, more neglected, there are still a number of large foundations, and funding for several diseases is on the scale of hundreds of millions of dollars. Additionally, funding the development of a new drug may cost close to a billion dollars.
For these reasons, it is difficult to imagine a marginal dollar having any impact. However, as MacAskill argues at several points in Doing Good Better, this appears to only increase the riskiness of the donation, rather than reducing its expected impact.”
In 2018 Peter Wildeford and Marcus A. Davis wrote about the cost-effectiveness of vaccines and suggested that a malaria vaccine is competitive with other global health opportunities.
There are various posts about volunteering here.
I’ve linked some below that might be the most relevant.
Volunteering isn’t free
Effective Volunteering
What is a good answer for people new to EA that request advice on volunteering?
Why You Should Consider Skilled Volunteering
Also, the $70 billion on development assistance for health doesn’t include other funding that contributes to development:
$100b+ on non-health development
$500b+ in remittances
Harder to estimate, but over $1 trillion spent by LMICs on their own development and welfare
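A rough tally of the funding streams listed above (all figures are the approximate lower bounds given in the text; the LMIC figure is explicitly a loose estimate):

```python
# Approximate annual flows, in billions of US dollars (lower bounds from the text).
flows_b = {
    "development assistance for health": 70,
    "non-health development assistance": 100,
    "remittances": 500,
    "LMIC domestic development/welfare spending": 1000,  # loose estimate
}

total_b = sum(flows_b.values())
health_share = flows_b["development assistance for health"] / total_b
print(f"Total: ~${total_b}B per year; health aid is ~{health_share:.0%} of it")
```

On these rough numbers, the $70B of development assistance for health is only a few percent of the overall flows contributing to development.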
The Panorama episode briefly mentioned EA. Peter Singer spoke for a couple of minutes, and EA was mainly framed as a charity that would be missing out on money. There seemed to be a lot more interest in the internal discussions within FTX, crypto drama, the politicians, celebrities, etc.
Maybe Panorama is an outlier, but potentially EA is not that interesting to most people, or it seems too complicated to explain if you only have an hour.
I’ve written a bit about this here and think that they would both be better off if they were more distinct.
As AI safety has grown over the last few years, there may have been missed growth opportunities from not having a larger, more separate identity.
I spoke to someone at EAG London 2023 who didn’t realise that AI safety would be discussed at EAG until someone suggested they go, after they’d done an AI safety fellowship. There are probably many examples of people with an interest in emerging tech risks who would have got more involved earlier if they’d been presented with those options at the beginning.
In 2015, one survey found 44% of the American public would consider AI an existential threat. In February 2023 it was 55%.
I’ve written about this idea before FTX and think that FTX is a minor influence compared to the increased interest in AI risk.
My original reasoning was that AI safety is a separate field but doesn’t really have much movement building work being put into it outside of EA/longtermism/x-risk framed activities.
Another reason why AI takes up a lot of EA space is that there aren’t many other places to discuss these topics. That is bad for the growth of AI safety if it’s hidden behind donating 10% and going vegan, and bad for EA if it gets overcrowded by something that should have its own institutions/events/etc.
Could the main difference be that TBP is a simple process change with reduced costs, while EA-style giving would fundamentally alter grant evaluation, requiring more overhead from the funder?
I also think EA-style giving would involve extra costs for existing grantees: they would have to provide more evidence of their effectiveness or lose out to orgs that have those systems in place.
Separately, I think it will be very hard to get existing foundations to shift to using more EA frameworks unless their main donors become interested. There is probably more to be gained by finding and helping the UHNW people/orgs that are already inclined towards EA.