I’m not too confident about this, but one reason you may not have heard about men being held accountable in EA is that it’s not the sort of thing you necessarily publicize. For example, I helped a friend who was raped by a member of the AI safety research community. He blocked her on LessWrong, then posted a deceptive self-vindicating article mischaracterizing her and patting himself on the back.
I told her what was going on and, once she’d crafted her response, helped her post it via my account. His post was heavily downvoted, and eventually he deleted it.
That’s one example of what (very partial) accountability looks like; the end result in this case was a decrease in visibility for an anti-accountability post. And except for this thread, I’m not going around talking about my involvement in the situation.
I don’t know how much of the imbalance this accounts for, nor am I claiming that everything is fine. It’s just one thing to keep in mind when parsing the situation.
Thank you, yeah, I think I may be overindexing on a few public examples (not being privy to the private examples that you and others in this thread have brought up). I’m glad to hear that there are plenty of examples of the community responding well to protect victims/survivors.
I still also don’t think everything’s fine, but I’m unsure to what extent EA is worse than the rest of the world, where things are also not fine on this front.
I wonder if it would be helpful to have some kind of heavily anonymized summary statistics (e.g. aggregated across years) on the number of such incidents reported to CEA’s community health team (since they are the main group collecting this info), how they were dealt with, and what victims chose to do, to balance out the public accounts.
Does the appendix in Julia’s post here do what you’re looking for?
https://forum.effectivealtruism.org/posts/NbkxLDECvdGuB95gW/the-community-health-team-s-work-on-interpersonal-harm-in
Yeah, I think it does! It might be good to highlight it somewhere more people would see it (e.g. I read that post and the appendix, but forgot it was there!)
I’m strongly in favour of this. It often feels like what’s needed is to make this public, so that it becomes something the entire community is responsible for, as opposed to how it currently is: private, and mainly the responsibility of CEA’s community health team.
> I still also don’t think everything’s fine, but I’m unsure to what extent EA is worse than the rest of the world, where things are also not fine on this front.
FWIW this is exactly how I feel about gender-based issues in EA!