To me the core tension here is: even if, in a direct-impact sense, pure capabilities work is one of the most harmful things you can do (something I feel fairly uncertain about), it's still also one of the most valuable things you can do, in an upskilling sense. So at least until the point where it's (ballpark) as effective and accessible to upskill in alignment by doing alignment directly rather than by doing capabilities, I think current charitability norms are better than the ostracism norms you propose. (And even after that point, charitability may still be better for talent acquisition, although the tradeoffs are more salient.)
I think this might be reasonable under the charitability-vs-ostracism dichotomy.
However, I think we can probably do better. I run a crypto venture group, and we take "founders pledge" type stuff very seriously. We want to make strong, specific commitments before it's time to act on them (specifically, all upside past 2M post-tax for any member has to go towards EA crap).
Furthermore, when we talk to people, we don't really expect them (normatively speaking) to think we are aligned unless we emphasize these commitments. I would say we actively push the norm that we shouldn't receive charitability without a track record.
I would really advocate for the same thing here; if anything, it seems even more important.
That's not to say it's obvious what these commitments should be; it's more straightforward when the goal is making money.
My real point is that in normie land, charitability vs ostracism is the dichotomy. But I think in many cases EA already achieves more nuance: the norms demand proof of altruism in order to cash in on status.
Does that make sense? I think charitability is too strong of a norm and makes it too easy to be evil. I don't even apply it to myself! Even if there are good reasons to do things that are indistinguishable from just being bad, that doesn't mean everyone should just get the benefit of the doubt. I do think that specific pledges matter. The threat of conditional shunning matters.