We will also never know about serious issues if people are too afraid to speak up in a way that can be trusted and acted on. Creating a burner account out of fear might be a psychologically understandable reaction (although I suspect its prevalence is overstated), but it is not an effective or tactically appropriate reaction. Burner account accusations get upvotes and public sympathy, but they don’t accomplish much else. Actual change requires someone to stick their neck out, whether in a public post or in influential backchannels. There is no substitute for courage.
An anonymous accusation on its own will not make a difference. But that’s not its utility. The point is to get the ball rolling: to shift the environment so as to make it easier for people to speak up.
Suppose a prominent figure X has engaged in unethical behavior. Employee A knows about the behavior, but can’t speak up about it without being retaliated against. Persons B and C have each seen some sketchy behavior by X, but are unsure if it was just a one-off. Person D is a respected figure who would be horrified by X’s actions, but is unaware of them.
If A posts an anonymous accusation, B and C can realize that their sketchy observations match up with the account and lend it credibility, prompting D to investigate the issue independently, confirm that the account is correct, and get X kicked out. None of this would have happened without the anonymous whistleblowing from A.
Whistleblowing on bad behavior is good. I encourage people to do so publicly, but if you are unwilling or unable to do so, doing so anonymously is the second best option.
Can you name examples of this working? Because I’ve seen a good number of anonymous public accusations on this forum and I don’t recall any that led to the outcome you describe. I understand this theory of change but it sure doesn’t seem to work that way in real life.
In contrast I know of many cases where backchannel reporting to trusted third parties has led to results. If someone is not willing to speak up publicly, then using whisper networks or official reporting channels has a much better track record compared to making burner accusations on the EA forum. I am somewhat worried about people making an ineffective burner account post and feeling like they’ve done their job when otherwise they would’ve mustered up their courage and told the conference organizer.
One concern is that you’re calling on people to internalize the costs of bringing serious issues to light—which many believe are considerable—while the benefits are socialized. That’s not a good incentive structure.
It’s fine to call people to acts of supererogatory self-sacrifice, but I don’t think we should tell them they should either go down that path or remain entirely silent.
Use of trusted third parties (TTPs) is a possibility but has some significant limitations. I should write a post about TTPs acting under limited non-disclosure agreements at some point. Their use could alleviate some, but not all, of the disadvantages of burners.
In my experience anonymous accounts work fine? What’s important is having the information in public. Whether the account is anonymous or not isn’t very predictive of whether effective change occurs. For example, Brent was defended by CFAR, but got kicked out once anonymous accounts were posted publicly.
I actually do know the real names of the people who wrote about Brent. It’s one of those “community insiders know who they were but it’s hard to tell from the outside” situations, like the one I described with pre-doxxing Scott Alexander. If the authors had been anonymous for real then I don’t think it would’ve worked anywhere near as well. This approach avoids most of the downsides of actually-unknown-and-unaccountable burner accounts and I do not object to it.