There may not have been extended discussions, but there was at least one more recent warning. “E.A. leadership” is a nebulous term, but there is a small annual invitation-only gathering of senior figures, and they have conducted detailed conversations about potential public-relations liabilities in a private Slack group.
I don’t know about others, but I find it deeply uncomfortable that there’s an invite-only conference and a private Slack channel where, amongst other things, reputational issues are discussed. For one, there’s something weird about saying, on the one hand, “we should act with honesty and integrity” and, on the other, “oh, we have secret meetings where we discuss whether other people are going to make us look bad”.
I think it’s completely fine for invite-only Slacks to exist, and for their members to discuss matters they might not want leaked elsewhere. If they were plotting murders, or were implicated in serious financial crime, criminal enterprise, or other such awful, unforgivable acts, then yes, I can see why we would want to send a clear signal that anything like that is beyond the pale and discretion no longer protects you. In that instance I think no one would object to a breach of trust.
However, we aren’t discussing that scenario. This is a breach of trust, and it erodes honest discussion in private channels. The more acceptable this becomes in EA circles, the less honesty of opinion you will get, and the more paranoia will set in.
Acting with honesty and integrity does not mean opening up every discussion to the world, or having an expectation that chats will leak whenever you discuss “if other people are going to make us look bad”. Never mind the difficulty of then predicting what else warrants leaks, if that’s the bar you’ve set.
The thing that most keenly worries me is the lack of openness and accountability here. We are a social movement, so of course we will have power dynamics and leadership. But with no transparency or accountability, how can anyone know how to make change?
I think it’s wrong to say there’s no transparency or accountability (which isn’t to say we should assume the current checks are enough, but we shouldn’t conclude that none exist). Obviously, for anything actually criminal, proper whistleblowing paths exist and should be used! At the moment, I think even checks like this discussion are far more effective than in most other communities, because EA is still quite small and so doesn’t have the issues of scale that larger institutions or communities experience.
On transparency: transparency is a part of honesty, but it has costs, and I don’t think it’s at all clear that in this instance that cost needed to be paid. Again, this will only make future discussions slower, more guarded and less honest; the community’s response to this will likewise shape how much we guard ourselves when talking with other EAs. As a side point: this instance isn’t actually “transparency” either, but lines fed to a journalist, then selectively quoted and handed back to us.
The cost of transparency in every discussion at a high level of leadership (for example) is that the cost of new ideas becomes prohibitively high, because everyone can pick you apart, weigh in, misrepresent you, or redirect the discussion entirely. Compare, e.g., local council meetings held with the public present versus without, or decisions made in committee versus those made by individual founders. Again, transparency is a part of honesty, but I can put my trust in you, for example, without needing you to be transparent about every conversation you have about me. If, however, the norm is that we expect total transparency of information and constant leaks, then we should expect a community of paranoia, dishonest conversation and continuous misrepresentation of one another.
I think you may be assuming that what I’m calling for here is much more wide-ranging. There still doesn’t seem to be a good justification for not knowing who is in the coordination forum or on these leadership Slack channels. Making the structures that actually exist apparent to community members would probably not come at the prohibitively high cost you suggest.
Yes, I don’t know what I think of that, but you’re right that I implied you were thinking of something much more wide-reaching.