I think there’s a kernel of truth to this suggestion! I would put it this way: EA should be global, and it should continue to be powered by communities, but those communities should be local and small.
First, work in specific cause areas should continue to happen globally, but should not operate with an automatic assumption of trust.
“Low trust” wouldn’t mean we stop doing a lot of good; it would just mean that we need to be more transparent and rigorous, rather than having major EA figures texting with billionaires while the rest of us hope they get it right.
GiveWell is a great example of an EA institution built in a “low trust” framework. And it’s great! It would be very hard for me to imagine anything nearly this bad happening in the global health and well-being side of EA precisely because of places like GiveWell.
But small, high-trust community is also great! And we should encourage that — more local meetup groups, more university groups. I agree that funding for such things, after the “startup phase,” should also be a bit more locally sourced, with alumni funding university groups since students usually lack sources of income.
When I first hosted some EAs for a dinner in my home, someone in the group asked if I wanted funding for it. I had enough context in the EA community that I knew where this sentiment was coming from, but I still found it weird to imagine the tendrils of central EA funds reaching all the way down into my little house dinner. And given that the source was probably, at least counterfactually/fungibly, FTX, I’m glad I didn’t take it.