Coherence may not even matter that much. I presume that one of Open Philanthropy's goals in the worldview framework is to have neat buckets for potential donors to back depending on their own views. I also reckon that even if OP staff don't personally hold incoherent beliefs, attracting the donations of those who do is probably more advantageous than rejecting them.
It’s fine to offer recommendations within suboptimal cause areas for ineffective donors. But I’m talking about worldview diversification for the purpose of allocating one’s own (or OpenPhil’s own) resources genuinely wisely, given one’s (or: OP’s) warranted uncertainty.