Can you clarify why you think your three criteria are enough to ascribe benign intentions the majority of the time? The point I was trying to get at was that there’s no relation to thinking a lot about how to make the world a better place and making sacrifices to achieve that AND also having benign intentions towards other groups. People can just more narrowly define the world that they are serving.
A concrete example of how believing women have less worth than men could be harmful when evaluating charities: charity #1 helps women by X utils, and charity #2 helps men by X utils. (Perhaps charity #1 reduces the amount of work women need to do by building a well for water, etc.) Believing women have less worth than men would lead to charity #2 strictly dominating charity #1, when they should actually be equally recommended.
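The arithmetic above can be sketched in a few lines. This is a hypothetical illustration (the `utility` function, the value of `X`, and the 0.8 weight are all made up for the example), showing how a discounted moral weight flips an otherwise tied ranking:

```python
def utility(utils_delivered, moral_weight=1.0):
    """Utility as perceived by an evaluator who weights beneficiaries by moral_weight."""
    return utils_delivered * moral_weight

X = 100  # both charities deliver the same raw benefit, X utils

# Unbiased evaluator: both groups weighted equally.
charity_1 = utility(X, moral_weight=1.0)  # helps women
charity_2 = utility(X, moral_weight=1.0)  # helps men
assert charity_1 == charity_2  # correctly equally recommended

# Biased evaluator: benefits to women discounted (weight 0.8, arbitrary).
charity_1_biased = utility(X, moral_weight=0.8)  # helps women
charity_2_biased = utility(X, moral_weight=1.0)  # helps men
assert charity_2_biased > charity_1_biased  # charity #2 now "strictly dominates"
```

The point is just that the bias enters as a multiplier on identical benefits, so it changes the ranking without changing anything about the charities themselves.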
In terms of people having the ‘right’ philosophy — what I’m saying is that there’s nothing inherent to EA that prevents it from coexisting with misogyny. It’s not a core EA belief that women are equal to men. So we shouldn’t be surprised when EAs act as misogynists.
In any case, you admit that your criteria aren’t sufficient to screen out all negative intentions. When you say we give the benefit of the doubt for the sake of the EA project, what you’re saying is that demographic minorities need to accept some level of malevolence in their communities in exchange for the privilege of contributing to the EA cause. Why should the burden be on them? Why not instead place the burden (if you can even call it that) on the individuals who don’t have to worry about this systematic malevolence, and ask them to think about what they say before they say it? That is what this document suggests we do.
(I’m not going to address each of your rebuttals individually because the main points I want to defend are the two I’ve tried to clarify above.)
Augur is a decentralized protocol built on a blockchain that allows anyone to set up a prediction market about anything. Although I’m not sure about the legality, the fact that no single individual or institution owns or runs Augur suggests to me it might be easier to build niche, specific prediction markets on top of it.