I agree it’s good for a community to have an immune system that deters people who would hurt its main goals, EA included. But (and I hear that you care about calibrating on this too) we also want to avoid false positives. Irving’s comment below seems like an example, and he said it better than I could: we’re already leaving a lot of value on the table. I expect our disagreement is purely empirical and about exactly that, so I’m happy to leave it here, since it’s only tangentially relevant to the OP.
Aside: I don’t know anything about Will’s intentions; I’ve only read his comment and your reply, and I don’t think ‘he could have made a different comment’ is good evidence about his intentions. I’ll assume you know much more about the situation and background than I do, but if not, I do think it’s important to give people the benefit of the doubt on the question of intentions.
[Meta: in case not obvious, I want to round off this thread, happy to chat in private sometime]