I think you're unintentionally dodging both Aaron's and Ben's points here, by focusing on the generic idea of intellectual diversity and ignoring the specifics of this case. It simply isn't the case that disagreeing about *anything* can get you no-platformed/cancelled/whatever. Nobody seeks 100% agreement with every speaker at an event; for one thing, that sounds like a very dull event to attend! But there are specific areas people are particularly sensitive to, this is one of them, and Aaron gave a stylised example of the kind of person we can lose here immediately after the section you quoted. It really doesn't sound like what you're talking about.
> A German survivor of sexual abuse is interested in EA Munich's events. They see a talk with Robin Hanson and Google him to see whether they want to attend. They stumble across his work on "gentle silent rape" and find it viscerally unpleasant. They've seen other discussion spaces where ideas like Hanson's were brought up and found them really unpleasant to spend time in. They leave the EA Munich Facebook group and decide not to engage with the EA community any more.
Like Ben, I understand you as either saying that this person is sufficiently uncommon that their loss is worth the more-valuable conversation, or that we don't care about someone who would distance themselves from EA for this reason anyway (it's not an actual "loss"). And I'm not sure which it is or (if the first) what percentages you would give.
The thing that I am saying is that in order to make space for someone who tries to enforce such norms, we would have to kick many other people out of the community, and stop many others from joining. It is totally fine for people to not attend events if they just happen to hit on a topic that they are sensitive to. But for someone to completely disengage from a community, and avoid talking to anyone in that community, because a speaker at some event had some opinions that they were sensitive to, opinions that weren't even the topic of the announced talk, is obviously going to exert substantial pressure on what kind of discourse is possible with them.
This doesn't seem to fit nicely into the dichotomy you and Ben are proposing here, which just has two options:
1. They are uncommon
2. They are not valuable
I am proposing a third option which is:
3. They are common and potentially valuable on their own, but they also impose costs on others that outweigh the benefits of their participation, and those costs make it hard to build an intellectually diverse community out of people like that. And it's really hard to integrate them into a discourse that might come to unintuitive conclusions if they systematically avoid engaging with any individuals who have, at some point in their public history, expressed ideas they are particularly sensitive to.
It seems to me that the right strategy, if you are triggered by specific topics, is to simply avoid engaging with those topics (if you really have no way of overcoming your triggeredness, or if doing so is expensive). But it seems very rarely the right choice to avoid anyone who ever has said anything public about the topic that is triggering you! It seems obvious how that makes it hard for you to be part of an intellectually diverse community.
[EDIT: As Oli's next response notes, I'm misinterpreting him here. His claim is that the movement would be overall larger in a world where we lose this group but correspondingly pick up others (like Robin, I assume), or at least that the direction of the effect on movement size is not obvious.]
***
Thanks for the response. Contrary to your claim that you are proposing a third option, I think your (3) cleanly falls under mine and Ben's first option, since it's just a non-numeric write-up of what Ben said:
> Sure, we will lose 95% of the people we want to attract, but the resulting discussion will be >20x more valuable so it's worth the cost
I assume you would give different percentages, like 30% and 2x or something, but the structure of your (3) appears identical.
***
At that point my disagreement with you on this specific case becomes pretty factual: the number of sexual abuse survivors is large; my expected percentage of them that don't want to engage with Robin Hanson is high; the number of people in the community with on-the-record statements or behaviour that are comparably or more unpleasant to those people is small; and so I'm generally willing to distance from the latter in order to be open to the former. That's from a purely cold-blooded "maximise community output" perspective, never mind the human element.
Other than that, I have a number of disagreements with things you wrote, and for brevity I'm not going to go through them all; you may assume by default that everything you think is obvious I do not think is obvious. But the crux of the disagreement is here, I think:
> it seems very rarely the right choice to avoid anyone who ever has said anything public about the topic that is triggering you
I disagree with the non-hyperbolic version of this, and think it significantly underestimates the extent to which someone repeatedly saying or doing public things that you find odious is a predictor of them saying or doing unpleasant things to you in person, in a fairly straightforward "leopards don't change their spots" way.
I can't speak to the sexual abuse case directly, but if someone has a long history of making overtly racist statements, I'm not likely to attend a small-group event that I know they will attend, because I put high probability that they will act in an overtly racist way towards me and I really can't be bothered to deal with that. I'm definitely not bringing my children to that event. It's not a matter of being "triggered" per se; I just have better things to do with my evening than cutting some obnoxious racist down to size. But even then, I'm very privileged in a number of ways and so very comfortable defending my corner and arguing back if attacked; not everybody has (or should have) the ability and/or patience to do that.
There's also a large second-order effect: communities which tolerate such behaviour are much more likely to contain other individuals who hold those views and merely haven't put them in writing on the internet, which increases the probability of such an experience considerably. Avoidance of such places is the right default policy here, at an individual level at least.
No. How does my (3) match up to that option? The thing I am saying is not that we will lose 95% of the people; the thing I am saying is that we are going to lose a large fraction of people either way, and the world where you have tons of people who follow the strategy of distancing themselves from anyone who says things they don't like is a world where you both won't have a lot of people and will have tons of polarization and internal conflict.
How is your summary at all compatible with what I said, given that I explicitly said:
> with the second (the one where we select on tolerance) possibly actually being substantially larger
That by necessity means that I expect the strategy you are proposing to not result in a larger community, at least in the long run. We can have a separate conversation about the exact balance of tradeoffs here, but please recognize that I am not saying the thing you are summarizing me as saying.
I am specifically challenging the assumption that this is a tradeoff of movement size, using some really straightforward logic of "if you have lots of people who have a propensity to distance themselves from others, they will distance themselves and things will splinter apart". You might doubt that such a general tendency exists, or you might doubt that the inference here is valid and think that there are ways to keep such a community of people together either way; but in either case, please don't claim that I am saying something I am pretty clearly not saying.
Thank you for explicitly saying that you think your proposed approach would lead to a larger movement size in the long run; I had missed that. Your actual self-quote is an extremely weak version of this, since "this might possibly actually happen" is not the same as explicitly saying "I think this will happen". The latter certainly does not follow from the former "by necessity".
Still, I could have reasonably inferred that you think the latter based on the rest of your commentary, and should at least have asked if that is in fact what you think, so I apologise for that and will edit my previous post to reflect the same.
That all said, I believe my previous post remains an adequate summary of why I disagree with you on the object level question.
> Your actual self-quote is an extremely weak version of this, since "this might possibly actually happen" is not the same as explicitly saying "I think this will happen". The latter certainly does not follow from the former "by necessity".
Yeah, sorry, I do think the "by necessity" was too strong.