I feel bad for piling on, but I want to copy over my note from Slack because I think it's a succinct statement of an epistemology concern, less comprehensive than the other comments:
idk what channel is best for this comment, which I hesitate to make, because I share the broad goals of the document (besides one nagging detail), I don't wanna be that guy, it's not my hill to die on, etc. etc. I know some people will feel like this comment is a call to relitigate some object-level thing that a lot of people don't even want to be in the Overton window, and I'm sorry.
but I think it might be poisonous to precommit against science. believing true things is dual use. empirical beliefs shouldn't be assigned any moral status whatsoever. I don't care a lot about the object level here because it's not morally relevant, and it's only tactically relevant for things way outside my wheelhouse. But a culture that says "if you're investigating this mindkilled empirical topic that vanishingly few people have real expertise on, you're on thin ice, because a priori we know there's a right answer and a wrong answer socially speaking" is alarming and kinda anti-EA. Pointing to hypothetical harms downstream of beliefs propagating (by belief I mean in the strictest sense of an empirical and falsifiable map of the territory) doesn't get you out of that for free.
source: I co-run EA Philly with someone. my diversity credentials: used to tutor math at a community college, was a highly involved BLMer 2014-2016
For the record: Duncan's comment may have swayed me further toward seeing the harms of virtue signaling, making me more negative about the statement than I was when I chimed in on Slack.