I’m not that concerned about this.
First off, it is very hard to find a funding source that doesn’t create a conflict of interest. With ten megadonors in ten different fields, the conflicts would not be as acute but would be more broadly distributed. Government support brings conflicts. Relying on an army of small, mildly engaged donors creates “conflicts” of a different sort—there is a strong motivation to focus on what looks good and will play to a mildly-engaged donor base rather than what does good.
The obvious risk of a conflict of interest is that the money impedes or distorts the movement’s message. It’s generally not a meaningful problem for a kidney-disease charity to have financial entanglements with a social media company; it would very much be a problem for the American Academy of Pediatrics. Applying the principles of EA, it seems relatively unlikely that criticizing the harms of social media, or advocating for systemic change that would specifically or disproportionately tank Meta/Asana stock, would emerge as priority cause areas.
It seems more likely to me that faithful application of EA principles would lead down a path that is contrary to the interests of very wealthy donors more generally. But that is a hard problem to get around for a movement that wants to have great impact and needs loads of funding to do it.