The more time someone spends talking to a variety of community members (and potential future members), the more likely they are to have an accurate view of which norms will best encourage the community’s health and flourishing.
Correctness is not a popularity contest; this feels like an intellectual laundering of groupthink. Also, if you promote a particular view, that *changes* who is going to be a member of the community in the future, as well as who is excluded.
For example, the EA community has decided to exclude Robin Hanson and be more inclusive towards Slate journalists and people who like the opinions of Slate; this defines a future direction for the movement, rather than causing a fixed movement to either flourish or not.
This isn’t at all what I was trying to say. Let me try to restate my point:
“If you want to have an accurate view of what people say will help them flourish in the community, you’re more likely to achieve that by talking to a lot of people in the community.”
Of course, what people claim will help them flourish may not actually help them flourish, but barring strong evidence to the contrary, it seems reasonable to assume some correlation. If members of a community differ on what they say will help them flourish, it seems reasonable to try setting norms that help as many community members as possible (though you might also adjust for factors like members’ expected impact, as when 80,000 Hours chooses a small group of people to advise closely).
*****
Whether EA Munich decides to host a Robin Hanson talk hardly qualifies as “the EA community deciding to exclude Robin Hanson and being more inclusive towards Slate journalists,” save in the sense that what eight people in one EA group do is a tiny nudge in some direction for the community overall. In general, the EA community tends to treat journalists as a dangerous element, to be managed carefully if they are interacted with at all.
For example, the response to Scott Alexander’s near-doxxing (which drew much more attention than the Hanson incident) was swift, decisive, and near-unified in favor of protecting speech and unorthodox views from those who threatened them. To me, that feels much more representative of the spirit of EA than the actions of, again, a single group (who were widely criticized afterward, and didn’t get much public support).
If it’s only a tiny nudge, why are we talking about it?
Why is it important for a teacher to give a harsh detention to the first student who challenges their authority, or for countries to defend their borders strictly rather than let it slide if someone encroaches just a few kilometres?
An expectation is being set here. Worse, an expectation has been set that threats of protest are a legitimate way to influence decision-making in our community. You have ceded authority to Slate by obeying their smear-piece on Hanson. Hanson is one of our people, and you left him hanging in favour of what Slate thought. EA people are, IMO, being naïve.
If it’s only a tiny nudge, why are we talking about it?
I’m talking about something I considered a tiny nudge because I thought that a lot of people, including people who are pretty influential in communities I care about, either reacted uncharitably or treated the issue as a much larger deal than it was.
You have ceded authority to Slate by obeying their smear-piece on Hanson. Hanson is one of our people, you left him hanging in favour of what Slate thought.
To whom is “you” meant to refer? I don’t work on CEA’s community health team and I’ve never been in contact with EA Munich about any of this.
I also personally disagreed with their decision and (as I noted in the post) thought the Slate piece was really bad. But my disagreeing with them doesn’t mean I can’t try to think through different elements of the situation and see it through the eyes of the people who had to deal with it.
I think the issue here is attempting to unilaterally disarm in a culture war. If your attitude is “let’s think through different elements of the situation and see it through the eyes of the people who had to deal with it”, and their attitude is “let’s use the most effective memetic superweapons at our disposal to destroy everyone we disagree with”, then you’re going to lose and they are going to win.
A stark conclusion of “you’re going to lose” seems like it’s updating too much on a small number of examples.
For every story we hear about someone being cancelled, how many times has such an attempt been unsuccessful (no story) or even led to mutual reconciliation and understanding between the parties (no story)? How many times have niceness, community, and civilization won out over opposing forces?
(I once talked to a professor of mine at Yale who was accused by a student of sharing racist material. It was a misunderstanding. She resolved it with a single brief email to the student, who was glad to have been heard and had no further concerns. No story.)
I’m also not sure what your recommendation is here. Is it “refuse to communicate with people who espouse beliefs of type X”? Is it “create a centralized set of rules for how EA groups invite speakers”?