3) Critics (e.g. @CarlaZoeC, @LukaKemp) warned that EA should decentralize funding so it doesn’t become a closed validation loop in which the people in SBF’s inner circle get millions to promote his and their vision for EA while others don’t. But EA funding remained overcentralized.
I think the FTX regranting program was the single biggest push to decentralize funding that EA has ever seen, and it’s crazy to me that anyone could look at what FTX Foundation was doing and say that the key problem was that funding decisions were becoming more, rather than less, centralized. (I would be interested in hearing from those who had some insight into the program whether this seems incorrect or overstated.)
That said, two caveats. First, I was a regrantor, so I am biased. Second, even aside from the tremendous damage caused by the foundation needing to back out and the possibility of clawbacks, the fact that at least some of the money being regranted was stolen makes the whole thing completely unacceptable. However, it was unacceptable in ways that have nothing to do with being overly centralized.
I think that these are good lessons learned, but regarding the last point, I want to highlight a comment by Oliver Habryka:
This seems really important, and while I’m not sure that politics is the mind-killer, I think the forum and EA in general need to be really, really careful about community dynamics. The principal problem pointed out by the recent “Bad Omens” post was peer pressure towards conformity in ways that lead to people acting like jerks, and I think we’re seeing that play out here as well, except that the dynamics are being pushed by central people in EA orgs rather than by local EA groups. That seems far more worrying.
So yes, I think there are lots of important lessons to learn about politics, but those matter narrowly. The biggest risk of failing to tread carefully here isn’t wasting money on political campaigns; it’s undermining our ability to make trustworthy claims far more generally. We need to do our best to exhibit epistemic standards that are not just better than anyone else’s in politics—a bar too low to be worth noticing, much less aiming for—but ones that actually should engender trust among both EAs and the rapidly growing set of people who are watching. And because politics operates at high simulacra levels, I’m concerned that in our rush to focus on various legitimate concerns and lessons while “doing politics” at the object level, we aren’t learning those lessons.