I work for CEA, but the following views are my own. I don’t have any plans to change Forum policy around which topics are permitted, discouraged, etc. This response is just my attempt to think through some considerations other EAs might want to make around this topic.
--
While we all have topics on which our emotions get the better of us, those who leave are likely to be overcome to a greater degree and on a wider variety of topics. This means that they will be less able to contribute productively by providing reasoned analysis. Beyond this, they are more likely to contribute negatively by being dismissive, producing biased analysis, or engaging in personal attacks.
I don’t really care how likely someone is to be “overcome” by their emotions during an EA discussion, aside from the way in which this makes them feel (I want people in EA, like people everywhere, to flourish).
Being “overcome” and being able to reason productively seem almost orthogonal in my experience; some of the most productive people I’ve met in EA (and some of the nicest!) tend to have unusually strong emotional reactions to certain topics. There are quite a few EA blogs that alternate between “this thing made me very angry/sad” and “here’s an incredibly sophisticated argument for doing X”. There’s some validity to trying to increase the net percentage of conversation that isn’t too emotionally inflected, but my preference would be to accommodate as many productive/devoted people as we can until it begins to trade off with discussion quality. I’ve seen no evidence that we’re hitting this trade-off to an extent that demands we become less accommodating.
(And of course, biased analysis and personal attacks can be handled when they arise, without our needing to worry about being too inclusive of people who are “more likely” to contribute those things.)
The people who leave are likely to be more ideological. There is generally an association between being more radical and being more ideological, even though some people are radical without being ideological. People who are more ideological are less able to update in the face of new evidence, and are also less likely to provide the kind of reasoned analysis that would cause other EAs to update towards their views.
See the previous point. I don’t mind having ideological people in EA if they share the community’s core values. If their commitment to an ideology leads them to stop upholding those values, we can respond to that separately. If they can provide reasoned analysis on Subject A while remaining incorrigibly biased on Subject B, I’ll gladly update on the former and ignore the latter. (Steven Pinker disagrees with many EAs quite sharply on X-risk, but most of his last book was great!)
We can accommodate people who have low levels of emotional control (this is distinct from feeling strong emotions) and who are more ideological. However, while this makes the group more welcoming for those individuals, it makes it less welcoming for everyone else, so it’s not clear that it results in the group being more welcoming overall, as we were promised. In any case, it helps highlight how narrow the particular conception of inclusion put forward by Making Discussions Inclusive actually is.
1. I’d really recommend finding a different phrase than “low levels of emotional control”, which is both more insulting than seems ideal for conversations in an EA context and too vague to be a useful descriptor. (There are dozens of ways that “controlling one’s emotions” might be important within EA, and almost no one is “high” or “low” for all of them.)
2. “Less welcoming for everyone else” is too broad. Accommodating people who prefer that some topics not be brought up certainly makes EA less welcoming for some people: competing access needs are real, and a lot of people aren’t as comfortable with discussions where emotions aren’t as controlled, or where topics are somewhat limited.
But having “high emotional control” (again, I’d prefer a different term) doesn’t necessarily mean feeling unwelcome in discussions with people who are ideological or “less controlled” in some contexts.
One of the features I like most in a community is “people try to handle social interaction in a way that has the best average result for everyone”.
I’d consider “we figure out true things” to be the most important factor we should optimize for, and our discussions should aim for “figuring stuff out”. But that’s not the only important result; another factor is “we all get along and treat each other well”, because there’s value in EA being a well-functioning community of people who are happy to be around each other. If having a topic consistently come up in conversation is draining and isolating to some members of the community, I think it’s reasonable that we have a higher bar for that topic.
This doesn’t mean abandoning global poverty because people think it seems colonialist; it might mean deciding that someone’s Mormon manifesto doesn’t pass the bar for “deserves careful, point-by-point discussion”. That isn’t very inclusive to the manifesto’s author, but it seems very likely to increase EA’s overall inclusiveness.
This is one of those circumstances where changing the phrase would likely mean avoiding the issue. I agree that we don’t want people to be unfeeling automatons, and that there are circumstances when expressing even “negative” emotions like anger can be positive. At the same time, the idea that different people have different levels of emotional control seems to be a very useful model, even if it doesn’t perfectly describe reality (i.e. it is context-dependent). You’ve already noted that some behaviours put a burden on most people; having low levels of emotional control, or being ideological, falls into this category.
I’ll note one argument that you could have put forward: low levels of emotional control may be correlated with positive characteristics, such as creativity or the ability to be enthusiastic or authentic. So perhaps a filter on this quality would be net negative.
I’m not sure what you mean by ‘low emotional control.’ Are you talking about people who can’t control their reactions, or who can but find it tiring, or who can but choose not to?
I’m very emotional, but if someone’s rude to me in the context of a government negotiation, no one would be able to tell I even heard the insult (depending—in some contexts it’s strategic to assert yourself and set boundaries).
If someone’s rude to me in a social context, though, they’re going to get an earful! I don’t get paid to take your crap, so when someone insults me, either they’re going to hear about it or I’m going to leave.
So… Is that a low level of emotional control, or a high level of emotional control? What exactly are you referring to?