Summary: Excessive privacy measures on the forum of a movement that otherwise claims to be among the most transparent of all movements may provoke increased efforts by outsiders, such as journalists, to gain access to that information. A process by which some sensitive information is trusted as secure may also create a false sense that information hazards of global significance are adequately protected by the same means. They are not, so there should be distinct procedures or protocols for securing the different kinds of information actors in EA seek to keep private or secret. In general, discourse actors in EA seek to keep private should not be conducted on the EA Forum.
This echoes a broader concern I’ve noticed being expressed over the last year about the ambiguous relationship effective altruism has with news media and (various interest groups among) the public as the movement has become more prominent.
One potential problem to bear in mind is the Streisand effect: “a phenomenon that occurs when an attempt to hide, remove, or censor information has the unintended consequence of increasing awareness of that information, often via the Internet.” The more effective altruism is suspected of hiding something, the harder people may try to gain access to that information.
Nothing prevents anyone outside of EA from starting an account on the EA Forum. Restricting sensitive discussions on the EA Forum to only some users with accounts may send an errant signal that incentivizes those seeking private information to try harder. It may also arbitrarily limit the transparency of discourse within EA itself.
Another problem is that information hazards and other kinds of information many in EA prefer to keep private are conflated with each other. I do respect concerns that negative publicity about some information privy to EA-affiliated organizations could seriously impair their capacity to do good. Yet a potential scandal about a political candidate or large sums of money can be resolved, while the exposure of information that dramatically increases existential risk probably cannot.
Any process by which some sensitive information in EA is trusted to be secure may generate a risky and false sense that the same process will suffice to secure genuine information hazards. To minimize that risk, there should be distinct sets of procedures and protocols: one for securing information hazards, and others for all other kinds of information actors in EA may prefer to remain private.