I feel like many (most?) of the “-ist”-like descriptors that apply to me are dependent on empirical/refutable claims. For example, I’m also an atheist—but that view would potentially be quite easy to disprove with the right evidence.
Indeed, I think it’s just very common for people who hold a certain empirical view to be called an X-ist. Maybe that’s somewhat bad for their epistemics, but I think this piece goes too far in suggesting that I have to think something is “for-sure-correct” before “-ist” descriptors apply.
Separately, I think identifying as a rationalist and effective altruist is good for my epistemics. Part of being a good EA is having careful epistemics, updating based on evidence, being impartially compassionate, etc. Part of being a rationalist is being aware of, and willing to correct for, my cognitive biases. When someone challenges me on one of these points, my professed identity gives me a strong incentive to behave well that I wouldn't otherwise have. (To be fair, I'm less sure this applies to "longtermist", which I think has much less pro-epistemic baggage than EA.)