Like many people, I’ve been following this thread with dismay. I think that Frances’s experiences sound terrible, and seem very unnecessary.
I have hesitated to weigh in on this thread. But I agree that the answers can’t just be at the policy level; and I’m keen to see further discussion about cultural dynamics which may contribute to the issues[1]. At this point I’ve given this question a good amount of thought (though I could definitely still be wrong), so I wanted to highlight a couple of things people might want to consider:
Focus on intent
I’m glad Frances calls this out as a problem, as I think it’s underappreciated as a contributing factor to problematic dynamics. I think it also has further issues beyond those she lists.
A focus on intent:
Gets in the way of straightforwardly talking about what kinds of behaviour are good or bad
Moves attention from “person X had bad experiences; what happened there and how could they have been avoided?” to “is person Y bad?” (which is liable to lead to people coming to Y’s defence, in a way that risks invalidating X’s experiences)
Sends the (potentially dangerous) message “if your intentions are good, there’s nothing to worry about”
Distrust of moral intuitions
(caveat: not sure I’m naming the truest version of this; but I’m pretty sure there’s something in this vicinity)
I think EA teaches people that it’s important to think through the implications of our actions, rather than relying on unconsidered moral intuitions. Which is correct! But I worry that sometimes people take this lesson too far, and stop paying attention to their own moral intuitions when they don’t have explicit arguments for them[2].
A friend put it to me as “I think sometimes EA accidentally encourages a lack of groundedness”.
Anyway, it’s pure speculation on my part to imagine this at play in CEA’s (in)actions. But rather than imagine that the people reading Riley’s document didn’t feel any discomfort, I find it easier to imagine them feeling a little uncomfortable about it but not trusting the discomfort, or orienting in a locally-consequentialist way and guessing that it would ultimately create more costs and be worse (possibly including worse for Frances) to escalate it rather than leave it be.
TBC, I don’t think that the right amount of focus on intent or distrust of our own moral intuitions is zero! And I absolutely think that it’s possible to do these in ways that are healthy. But if I’m right, then I kind of want people to be tracking the potential vulnerabilities from going too far in these directions; so wanted to share. I’ll default to not posting more on this thread.
For the removal of any ambiguity, I’m not trying to disclaim personal responsibility for my own past mistakes! But when things go wrong to the degree of causing harm, I think they’ve often gone wrong at several levels at once; it’s useful to look at all of these.
I would go further, and say that given CEA’s specific history and promises of change around sexual harassment[1], we should hold them to an even higher standard than that.
[1] CEA was and is a member organisation of EV UK, and the findings partially concerned CEA’s Community Health Team.
I am sympathetic to this.
[2] Or further: discount their own sense of right and wrong in order to defer to people who’ve thought about things more.