This isn't a direct update (I think something along those lines would be useful), but the most up-to-date things in terms of funding might be:
EA Funding Spreadsheet that's been floating around, updated with 2023 estimates. This shows convincingly, to me, that the heart of EA funding is in global health and development for the benefit of people living now, and not in longtermist/AI risk positions[1]
GWWC donations: between 2020 and 2022, roughly two-thirds of donations went to Global Health and Animal Welfare
In terms of community opinions, I think the latest EA Survey is probably the best place to look. But as it's from 2022, and we've just come off EA's worst ever year, I think a lot will have changed in terms of community opinion, and some people will understandably have walked away.
Reading the OP for the first time in 2024 is interesting. Taking the Leaders Forum's opinions on cause areas and using them to anchor the 'ideal' allocation between cause areas... hasn't really aged well, let's just say that.
[1] I'm not actually taking a stand on the normative question. It's just bemusing to me that so many EA critics go for the 'the money used to go to bed nets, now it all goes to AI Safety research' critique despite the evidence pointing out this isn't true
What do you mean by the last point/that it hasn't aged well?
I think it's probably a topic for its own post/dialogue, I guess, but I think that over the last two years (post-FTX, the fallout, and the public beating that EA has suffered and is still suffering), 'EA Leadership', broadly defined, has lost a lot of trust, and the right to be deferred to. I think arguments for decentralisation/democratisation à la Cremer look stronger with each passing month. Another framing might be that, with MacAskill having to take a step back post-FTX until legal matters are more finalised (I assume; please correct me if I'm wrong), nobody holds the EA 'mandate of heaven'.[1]
It's also especially odd to me that Ben takes >50% of resources (defined as money and people) going towards Longtermism as the lodestar to aim for, rather than asking 'hmm, isn't it weird that this doesn't match EA funding patterns at all?' Revealed preferences show a very different picture of what EAs value; see the GWWC donations above or the Donation Election results. And the CURVE sequence seems to be one of the few places where we actually get concrete cost-effectiveness numbers for longtermist interventions; looking back, I'm not sure how much of it holds up to scrutiny.[2]
I also have an emotional, personal response that in the aftermath of EA's annus horribilis, a lot of the 'EA Leadership' (which I know is a vague term) has been conspicuous by its absence, not stepping up to take responsibility or provide guidance when times get tough, and instead directing the blame toward the 'EA Community' (also vaguely defined).[3] But again, that's just an emotional feeling, I don't have references for it to hand, and it definitely colours my perspective on this whole thing.
[1] This is an idea I want to explore in a post/discussion. If anyone wants to collaborate, let me know.
[2] At least from a 'this is the top impartial priority' perspective. I think from an 'exploring underrated/unknown ideas' perspective it looks very good, but that's not what the Leaders were asked in this survey
[3] I thought I recalled a Twitter thread from Ben where he talked about being separate from the EA Community as a good thing, and said that most of his friends weren't EAs, but I couldn't find it, so maybe I just imagined it or confused it with someone else?