I agree that increasing compensation to a happy price might be better than relying on altruism, and that selecting for altruistic individuals might mean selecting for those with high opportunity costs.
However, I don’t like the language or sentiment behind calling non-EAs “normies,” especially in a context like this. I think both the nomenclature and the blanket sentiment are bad epistemics, bad for the EA brand, and potentially evidence of a problematic worldview.
Could you please elaborate?