Ah. In one sense, a core part of rationality is indeed rejecting beliefs you can’t justify. Similarly, a core part of EA is thinking carefully about your impact. However, I think one claim you could make here is that naively and intensely optimising for these things will not actually win (e.g. lead to the formation of accurate beliefs, or save the world). Specifically:
Rationality: forming accurate beliefs often requires a deep integration with your feelings, such as paying attention to a note of confusion, or to something you can’t yet explain in rational terms. Indeed, imposing “rationality” constraints too early can be harmful, because you will tend to lose the information that can’t immediately comply with those constraints. Another example is defying social norms because you cannot justify them, only to realise later that they served some important function.
EA: naive, intense optimisation of your impact tends to end in burnout and depression.
Nice. What you wrote accords with my experience. In my own case, my relationship to EA changed quite substantially, and in the way you describe, when I transitioned from being very online to being embedded in a community.