Saw this post for the first time after it was linked from one of the recent FTX posts, and wanted to say thank you for having taken the time to write and express these concerns, which clearly weren’t very popular but turned out to be...prescient. I’m a bit frustrated this didn’t get more karma or engagement at the time.
I’m also frustrated that I probably just scrolled past without clicking or considering it, because it didn’t have much karma and seemed ‘against the mood.’ It feels important for everyone (like me) who was caught off guard this week to recognize that this was not, actually, unforeseeable. It’s humbling to realize how much work our cognitive biases must have been doing. Anyway, thanks!
“It is also similarly the case that EA’s should not support policy groups without clear rationale, express aims and an understanding that sponsorship can come with the reasonable assumption from general public, journalists, or future or current members, that EA is endorsing particular political views.”
This doesn’t seem right to me. I think anyone who understands EA should explicitly expect more consequentialist grant-makers to be willing to support groups whose political beliefs they strongly disagree with, if they also think those groups will take useful action with the funding.
As an observer, I would assume EA funders are just thinking through who has [leverage, influence, is positioned to act in the space, etc.] and putting aside any distaste they might feel for the group’s politics more readily than non-EA funders (e.g. the CJR program also funded conservative groups working on CJR whose views the program director presumably didn’t like or agree with for similar reasons).
“Other mission statements are politically motivated to a degree which is simply unacceptable for a group receiving major funds from an EA org.”
This seems to imply that EA funders should only give funding to groups that pass a certain epistemic purity test or are untouched by political considerations. I think applying EA-like epistemic standards to movement organizations in the US that touch on ~anything political would probably preclude you from funding anything political at all (maybe you’re arguing EA should therefore never fund anything that touches on politics, but that seems likely to leave a lot of impact on the table if taken to an extreme).
My guess is that if you looked at grantees in many other OP cause areas, you would see a wide spread of opinions among the grantees, many of whom don’t follow EA-like epistemic norms. E.g. I understand the FAW grant-making team supports a range of groups whose views on animal welfare are, in some cases, ideologically far afield from the internally stated goals of the program. Again, I don’t assume that the OP FAW POs necessarily endorse those views -- I assume those groups are being funded because the PO believes they will do effective work, or productively contribute to the FAW movement ecosystem overall (e.g. by playing bad cop to another organization’s good cop with industry).