Thanks a lot for this! Do you know if there is an updated version of this?
This isn’t a direct update (I think something along those lines would be useful), but the most up-to-date sources on funding might be:
- The EA Funding Spreadsheet that’s been floating around, updated with 2023 estimates. This shows convincingly, to me, that the heart of EA funding is in global health and development for the benefit of people living now, and not in longtermist/AI risk positions[1]
- GWWC donations: between 2020 and 2022, roughly two-thirds of donations went to Global Health and Animal Welfare
In terms of community opinion, I think the latest EA Survey is probably the best place to look. But as it’s from 2022, and we’ve just come off EA’s worst-ever year, I think a lot will have changed, and some people will understandably have walked away.
Reading the OP for the first time in 2024 is interesting. Taking the Leaders’ Forum’s opinions on cause areas and using them to anchor the ‘ideal’ allocation between causes… hasn’t really aged well, let’s just say that.
I’m not actually taking a stand on the normative question. It’s just bemusing to me that so many EA critics go for the “the money used to go to bed nets, now it all goes to AI Safety research” critique despite the evidence showing this isn’t true.
What do you mean by the last point/that it hasn’t aged well?
I think it’s probably a topic for its own post/dialogue, but I think that over the last two years (post-FTX, its fallout, and the public beating EA has suffered and is still suffering), ‘EA Leadership’, broadly defined, has lost a lot of trust and the right to be deferred to. I think arguments for decentralisation/democratisation à la Cremer look stronger with each passing month. Another framing might be that, with MacAskill having to take a step back post-FTX until legal matters are more finalised (I assume; please correct me if wrong), nobody holds the EA ‘mandate of heaven’.[1]
It’s also especially odd to me that Ben takes >50% of resources (defined as money and people) going towards Longtermism as the lodestar to aim for, instead of asking “isn’t it weird that this doesn’t match EA funding patterns at all?” Revealed preferences show a very different picture of what EAs value; see the GWWC donations above or the Donation Election results. And the CURVE sequence seems to be one of the few places where we actually get concrete cost-effectiveness numbers for longtermist interventions; looking back, I’m not sure how much holds up to scrutiny.[2]
I also have an emotional, personal response that, in the aftermath of EA’s annus horribilis, a lot of the ‘EA Leadership’ (which I know is a vague term) has been conspicuous by its absence, not stepping up to take responsibility or provide guidance when times get tough, and instead directing blame toward the “EA Community” (also vaguely defined).[3] But again, that’s just an emotional feeling, I don’t have references for it to hand, and it definitely colours my perspective on this whole thing.
This is an idea I want to explore in a post/discussion. If anyone wants to collaborate let me know.
At least from a ‘this is the top impartial priority’ perspective. I think from an ‘exploring underrated/unknown ideas’ perspective it looks very good, but that’s not what the Leaders were asked in this survey.
I thought I recalled a Twitter thread from Ben where he talked about being separate from the EA community as a good thing, and said that most of his friends weren’t EAs, but I couldn’t find it, so maybe I imagined it or confused it with someone else?