The OP is not titled “An incomplete list of activities that EA orgs should think about before doing”; it is “An incomplete list of activities that EA orgs Probably Shouldn’t Do”. I agree that most of the things listed in the OP seem reasonable to think about and take into account in a risk analysis, but I doubt the OP is actually contributing to people doing much more of that.
I would love a post that goes into more detail on “when each of these seems appropriate to me” — that would be much more helpful.
I think “probably shouldn’t” is fair. In most cases, these should be avoided. However, with appropriate mitigations and/or sufficient thought, they can be fine; you’ve given some examples of this.
In terms of the post itself, it has generated lots of good discussion about which of these norms the community might want to adopt and how they should be modified. The post is therefore valuable in its current form. It’s a discussion starter, IMO, not a list of rules to be cited in the future.