You’re clearly pointing at a real problem, and the only case in which I can read this as melodramatic is the case in which the problem is already very serious. So, thank you for writing.
When the word “care” is used carelessly, or, more generally, when the emotional content of messages is not carefully tended to, EA gets nudged towards being the sort of place where that carelessness is the norm. This has all sorts of hard-to-track negative effects: the people who are irked by misuse of the word “care” are disproportionately likely to be careful about this sort of thing themselves. If inattention to the connotations of words drives those friends away, it’s easy to see how a harmful “positive” feedback loop gets started.