Ah. In one sense, a core part of rationality is indeed rejecting beliefs you can’t justify. Similarly, a core part of EA is thinking carefully about your impact. However, I think one claim you could make here is that naively and intensely optimising for these things will not actually win (e.g. lead to accurate beliefs, or save the world). Specifically:
Rationality: forming accurate beliefs often requires a deep integration with your feelings: paying attention to a note of confusion, or to something you can’t yet explain in rational terms. Indeed, sometimes it is harmful to impose “rationality” constraints too early, because you will tend to lose the information that can’t immediately comply with those constraints. Another example is defying social norms because one cannot justify them, only to later realise that they served some important function.
EA: naively, intensely optimising your impact tends to produce burnout and depression, which undermine the impact being optimised for.
Sure, but that isn’t what the quoted text is saying. Trusting your gut or following social norms is not even on the same level as woo, or adopting beliefs with no justification.
If the harmful social norms Sasha actually had in mind were distrusting your gut and violating social norms for no gain, then I’d agree those behaviours are bad, and possibly a product of norms within the rationality community. Another explanation, though, is that the community is made up of a bunch of socially awkward nerds, a group known for social ineptness and an inability to trust their gut.
But as it stands, that doesn’t seem to be what’s being argued; the quoted text is at best tangential to what you said.