Linch wrote a post earlier this year about their experience as a junior grantmaker in EA. It had an interesting part about conflicts of interest:

4. Conflicts of interest (CoIs) are unavoidable if you want to maximize impact as a grantmaker in EA.
a. In tension with the above point, the EA community is just really small, and the subcommunities around specific fields within it (AI safety, forecasting, or local community building) are even smaller.
b. Having a CoI often correlates strongly with having enough local knowledge to make an informed decision, so (e.g.) if the grantmakers with the most CoIs on a committee always recuse themselves, a) you make worse decisions, and b) the other grantmakers have to burn more time getting up to speed on what you know.
c. I started off with a policy of recusing myself from even small CoIs. But these days, I mostly accord with (what I think is) the equilibrium: a) definite recusal for romantic relationships, b) very likely recusal for employment or housing relationships, c) probable recusal for close friends, d) disclosure but no self-recusal by default for other relationships.
d. To avoid issues with #3, I’ve also done more recusals in reverse: that is, I’ve made a conscious effort to avoid being too friendly with grantees.
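For illustration only, here is a minimal sketch of how a tiered recusal policy like the one Linch describes could be expressed as a simple decision table. The relationship categories, the Action enum, and default_action are hypothetical names I chose; none of this comes from Linch's post or any real grantmaking tooling.

```python
# Toy sketch of a tiered CoI recusal policy as a decision table.
# Category names and actions are hypothetical illustrations only.
from enum import Enum


class Action(Enum):
    RECUSE = "definite recusal"
    LIKELY_RECUSE = "very likely recusal"
    PROBABLE_RECUSE = "probable recusal"
    DISCLOSE_ONLY = "disclosure, but no self-recusal by default"


# Map each relationship type to its default action.
RECUSAL_POLICY = {
    "romantic": Action.RECUSE,
    "employment": Action.LIKELY_RECUSE,
    "housing": Action.LIKELY_RECUSE,
    "close_friend": Action.PROBABLE_RECUSE,
}


def default_action(relationship: str) -> Action:
    # Any relationship not explicitly listed falls back to disclosure only.
    return RECUSAL_POLICY.get(relationship, Action.DISCLOSE_ONLY)


if __name__ == "__main__":
    for rel in ["romantic", "housing", "close_friend", "acquaintance"]:
        print(f"{rel}: {default_action(rel).value}")
```

The point of the table form is that the defaults are explicit and auditable, while still leaving room for case-by-case judgment on top.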
What Linch is saying makes sense to me, even though it’s troubling: conflicts of interest are a real problem.

It seems natural that more due diligence and organizational safeguards (like avoiding CoIs) will be among the lessons learned from the FTX disaster. This could be an opportunity to prioritize such safeguards more highly and improve EA’s preparedness for worst cases (e.g. corruption), even at the expense of some of the impact/efficiency gains that come from having the best experts everywhere when things are going well.