I think the point I’m trying to make is that you need to adapt your language and norms to the audience you are talking to, which in the case of EA will often be people who are not rationalists or who have never even heard of rationalism.
If you approach an expert in nuclear policy, start talking about “inferential distances”, and send them LessWrong blog posts, you are impeding understanding and communication, not improving it. Your language may be more precise and accurate to someone else in your subculture, but to people outside it, it can be confusing and alienating.
Of course people on the EA Forum can read and understand your sentence. But the extra length impedes readability and communication, and I don’t think the extra things you signal with it add enough to make up for that. It’s not a huge deal in isolation, but the tendency toward unclear, overly verbose language is a real problem I see when rationalists communicate on other forums.