Upvoted. Thanks for putting in the time for the thoughtful response. I wasn’t sure whether the message I was trying to get across would land when I was writing this post, so your comment confirms that it didn’t.
It seems as though a standard media strategy of EAs is, if someone publishes a hit piece on us somewhere, ignore it and “respond” by presenting EA ideas better to begin with elsewhere. This is a way of being positive rather than negative in interactions, and avoiding signal-boosting bad criticisms. I don’t know how to explain how I have such a different impression, or why so many smart people seem to disagree with me, but this looks to me like an intuitively terrible, obvious mistake.
I agree with you that that’s the wrong kind of response to news media. I intended my post to be not about criticisms in news media, or the misconceptions they might provoke, but about posts by those already in the community on their own social media.
I used Scott’s post as an example because it’s similar to the kind of post I’m talking about. I suspected it might not land properly because Scott’s blog is so big that its effect may be more comparable to the impact of a newspaper than to the personal blog or social media feed of just any effective altruist. It turns out I was right it wouldn’t land, so it’s my mistake that the point I made got muddled.
Anyway, to reiterate, I think:
Individuals already in EA who write informal criticisms for the community itself should consider posting them on the EA Forum more often, even if they feel it may not be appropriate.
That’s because running those criticisms through a central locus lets those of us in EA who are best able to verify information about EA correct what’s false, and the cost of false info about sensitive subjects circulating among the general public may outweigh the social risks, real or perceived, of posting on the EA Forum.
Some things your comment gets at that I should have been explicit about:
Centralizing more discourse to a single locus like the EA Forum is something I’m suggesting only for internal discourse. Community members constantly having to correct each other, and the misconceptions we ourselves provoke, is an unnecessary redundancy. Dealing with that more efficiently can free up time and attention to focus on external criticisms properly, as you suggest. Criticisms in mainstream/news media, or from outside EA entirely, shouldn’t be dealt with that way.
Community members shouldn’t be discouraged from sharing candid opinions of their own elsewhere online, but it’d be preferable if those were shared on the EA Forum as well. A lot of valuable information that could become common knowledge is lost when it just sits on Twitter or Facebook.
Surely there are some portion of criticisms[...]that are hard to be persuasive against but are still bad, but we shouldn’t orient the movement’s entire media strategy around those.
With all the different kinds of criticism and the strategizing over how to respond to them, one of my points is that what gets lost is responses to criticisms that aren’t bad but are based on false premises, like the claims that EA has in practice only been inclusive of utilitarianism, or that it may be incompatible with religion. Those are the easiest kinds of misconceptions to dispel, but there isn’t much focus on them at all.
If some EA ever had the opportunity to write a high-quality response like Avital’s, or to be blunt almost any okay response, to the Torres piece in Aeon or Current Affairs, or for that matter in WSJ to their recent hit piece, I think it would be a really really good idea to do so, the EA forum is not a good enough media strategy.
I agree with this. I’ve thought about it before, but I’ve been skeptical that publications would be receptive. I’m not aware of many in EA who’ve tried anything like that, so it could be worth a shot to submit to, say, Aeon. Even so, it’s better that responses be posted on the EA Forum than nowhere else prominent online. For what it’s worth, I’ve also been thinking of writing more direct responses to mainstream criticism of EA, kind of like I started trying to do in this comment.
The one caveat I’ve got left is that, as to some of the ‘hit’ pieces, some of us in EA are in the awkward position of not seeing them as hit pieces. They’re perceived as somewhat fair coverage that makes some bad but also some good points about EA in need of addressing. That there are problems in EA in need of change is a different kind of reason for addressing external criticism on the EA Forum.
Thanks for the gracious response, and apologies for misunderstanding. I think I still disagree with parts of your post. I disagree with parts of his piece and think that Alexander could have done a better job getting background on Remmelt’s piece before singling it out (and I also think it would have been a good idea for him to crosspost it to the forum, although he was uncomfortable with this), but I still think the piece was a net benefit as written, and didn’t harm EA, or Remmelt himself, to any degree that we should be especially worried about. I do think interaction with the forum is generally beneficial, both for insiders and interested outsiders, but I’m not nearly so worried about the costs of publishing things about EA off the forum, and think many of the problems with doing this that exist in at least some cases are self-inflicted by current EA norms I would rather challenge instead.
Upvoted. No need to apologize; your criticism was valid given how I presented my case, which didn’t leave my main arguments particularly clear.
I still think the piece was a net benefit as written, and didn’t harm EA, or Remmelt himself, to any degree that we should be especially worried about.
Yeah, I haven’t read the comments on Scott’s follow-up post yet because there weren’t many when I first noticed it. I’m guessing there are more comments now, and some themes among the reactions may serve as an indicator of the ways Scott’s post has led to more or less accurate understandings of EA.
I expect its ultimate impact will be closer to neutral than significantly positive or negative. I’m guessing any downside would amount to only a few people being put off EA anyway. Public communication on this subject, three or more meta levels in, might be intellectually interesting, but in practice it’s too abstract to be high-stakes for EA.
Higher stakes, like the perception of a controversial topic in AI safety/alignment among social or professional networks adjacent to EA, might be a risk more worth considering for a post like this. Scott himself has for years handled controversies in AI alignment like this better than most in the community. I’m more concerned about those in the community who aren’t as deft as Scott making the mistakes in public communication about EA that he is relatively competent at avoiding.
many of the problems with doing this that exist in at least some cases are self-inflicted by current EA norms I would rather challenge instead.
I don’t have as strong a sense yet of what exactly the main causes of this are, but in general I get the same impression.
Interestingly, we might not disagree on very much after all. I probably did too much pattern matching between your writing and broader impressions I get about EA media strategies. Still, glad we got to chat it out!
Yeah, me too! Thanks for the conversation!