This will be my last message in this thread, because I find this conversation upsetting every time it happens (and every time it becomes clear that nothing will change). I find it really distressing that a bunch of lovely and caring people can come together and create a community that can be so unfriendly to the victims of assault and harassment.
And I find it upsetting that these lovely and caring people can fall into what is, from my perspective on morality, a serious moral failure (I say this while also accepting that this reflects not evilness but a disagreement about morality, such that the lovely, caring people really do remain lovely and caring and simply disagree with me about a substantive question).
To reply to your specific comments, I certainly agree that there is room for nuance: situations can be unclear and there can be clashes of cultural norms. Navigating the moral world is difficult and we certainly need to pay attention to nuances to navigate it well.
Yet as far as I’m concerned, it remains the case that someone’s contributions via their work are irrelevant to assessing how we should respond to their serious wrongdoing. It’s possible to accept the existence of nuance without thinking that all nuances matter. I do not think that this nuance matters.
(I’m happy to stick to discussing serious cases of wrongdoing and simply set aside the more marginal cases. I think it would represent such a huge step forwards if EA could come to robustly act on serious wrongdoing, so I don’t want to get distracted by trying to figure out the appropriate reaction to the less crucial cases.)
I cannot provide an argument for this of the form that Oliver would like, not least because his comment suggests he might prefer an argument that is ultimately consequentialist in nature, even if several layers removed, but I think that is fundamentally the wrong approach.
Everyone accepts some moral claims as fundamental. I take it as a fundamental moral claim that when a perpetrator commits a serious wrong against someone, it is the nature of the wrong (and perhaps the views of the person wronged, per Jenny’s comment) that determines the appropriate response. I don’t expect that everyone reading this comment will agree with this, and I don’t believe it’s always possible to argue someone into a moral view (I think at some fundamental level, we end up having to accept irreconcilable disagreements, as much as that frustrates the EA urge to be able to use reason to settle all matters).
(At this point, we could push into hypothetical scenarios like, “what if you were literally certain that if we reacted appropriately to the wrongdoing then everyone would be tortured forever?”. Would the consequences still be irrelevant? Perhaps not, but the fact of the matter is that we do not live in a hypothetical world. I will say this much: I think that the nature of the wrongdoing is the vastly dominating factor in determining how to respond to that wrongdoing. In realistic cases, it is powerful enough that we don’t need to reflect on the other considerations that carry less weight in this context.)
I’ve said I don’t expect to convince the consequentialists reading this to accept my view. What’s the point then? Perhaps I simply hope to make clear just how crucial an issue of moral conscience this is for some people. And perhaps I hope that this might at least push EA to consider a compromise that is more responsive to this matter of conscience.
I’m sorry you’ve found this conversation upsetting, and I think it’s entirely reasonable not to want to continue it, so I’ll leave things here. I appreciate the openness, and your willingness to express this opinion despite expecting to find the conversation upsetting!