I think individuals and institutions in EA need to do a better job of mitigating risks created by unequal power dynamics. In a previous job, I conducted research related to institutional accountability and sexual assault. One common theme is that the way that institutions and communities respond to bad behavior by key figures is shaped by their norms and systems, with certain attributes making accountability more difficult to achieve. In my opinion, there are several aspects of the EA movement as it currently exists—including the blurrier work/life boundaries for many folks, the outsize power of certain community leaders, the frequent reliance on ad-hoc rather than formal systems, and the movement’s small size—that make accountability particularly difficult, and I don’t feel that we have done enough to create systems that respond to these risks.
Let’s think through an example. (To be clear, this is entirely hypothetical.) Imagine that a woman is harassed by a prominent community leader. She works for a small EA org. Her boss is close friends with her harasser, and the org receives significant funding from his organization. She wants to say something, but she doesn’t want to threaten her job or their funding. Not only that, but most of her friends are in EA circles, and she knows speaking up would be divisive.
Some of the things that make this sort of situation more difficult in EA are based on parts of the community that would be difficult or undesirable to change. But some of them are worth changing, and the existence of all of them makes the creation of robust systems even more important.
I think it’s useful for institutions to think through these sorts of exercises. What if a major donor were engaging in bad behavior? An organization’s leader? To what extent would victims feel able to come forward? How likely would it be that the victim would face negative consequences for speaking up, versus the perpetrator facing real consequences?
There are always going to be bad actors. It’s up to communities and institutions to set up systems so that when improper behavior occurs (whether harassment, assault, or something else), bad actors are more likely to face accountability for their actions. With rare exceptions, the deck is stacked against the victim and in favor of the perpetrator; good systems can reduce how strongly the deck is stacked.
What do these good systems look like?
Well-publicized, accessible systems (within orgs, community spaces, and events) that allow people to report incidents of improper behavior
Clear policies for how institutions will respond to reports, including how they will maintain confidentiality
Thoughtful procedures for reducing the likelihood of retaliation
Explicit conflict of interest policies for orgs and grantmakers
Robust governance systems
This is just a start, but hopefully a helpful one! These are conversations worth having.