I strongly agree with the points Ben Hoffman has been making (mostly in the other threads) about the epistemic problems caused by holding criticism to a higher standard than praise. I also think that we should be fairly mindful that providing public criticism can have a high social cost to the person making the criticism, even though they are providing a public service.
There are definitely ways that Sarah could have improved her post. But that is basically always going to be true of any blog post unless one spends 20+ hours writing it.
I personally have a number of criticisms of EA (despite overall being a strong proponent of the movement) that I am fairly unlikely to share publicly, due to the following dynamic: anything I write that wouldn’t incur unacceptably high social costs would have to be a highly watered-down version of the original point, and/or involve so much of my time to write carefully that it wouldn’t be worthwhile.
While I’m sympathetic to the fact that there’s also a lot of low-quality / lazy criticism of EA, I don’t think the right response is to set a high quality bar that criticism must clear.
(Note that I don’t think that EA is worse than is typical in terms of accepting criticism, though I do think that there are other groups / organizations that substantially outperform EA, which provides an existence proof that one can do much better.)
I strongly agree with the points Ben Hoffman has been making (mostly in the other threads) about the epistemic problems caused by holding criticism to a higher standard than praise. I also think that we should be fairly mindful that providing public criticism can have a high social cost to the person making the criticism, even though they are providing a public service.
This is completely true.
I personally have a number of criticisms of EA (despite overall being a strong proponent of the movement) that I am fairly unlikely to share publicly, due to the following dynamic: anything I write that wouldn’t incur unacceptably high social costs would have to be a highly watered-down version of the original point, and/or involve so much of my time to write carefully that it wouldn’t be worthwhile.
There are at least a dozen people for whom this is true.
I feel like this is true for me too, though I’d guess I’ve got more spare time on my hands than you guys, and I don’t currently work for any EA charities. It’s really hard to make your beliefs pay rent when you’re in near mode and constantly worried that if you screw up a criticism you’ll lose connections and get ostracized, or that you’ll hurt the trajectory of a cause or charity you like by association; as much as we like to say we’re debiased, a lot of the time affective rationalizations sneak into our motivations.
We all come from different walks of life, and a lot of us haven’t been in communities that try to be as intellectually honest and epistemically virtuous as EA does. It’s hard to stop keeping our guard up, because everywhere else in life our new ideas are treated utterly uncharitably, worse than anything in EA on a regular day. Those patterns are hard to unlearn. We as a community need to find ways to trust each other more, but that takes a lot of work and will take a while.
In the meantime, I don’t have a lot to lose by criticizing EA, or at least I can take a hit pretty well. There may be social opportunity costs, things I won’t be able to do in the future if I become low-status, but I’m confident I’m the sort of person who can create new opportunities for himself. So I’m not worried about me, and I don’t think anyone else should be either. I’ve also never settled on a cause selection. Honestly, it felt weird to talk about, but the model uncertainty across causes that people are going for now is something I’ve implicitly grasped the whole time. I never understood why everyone was so confident in their views on causes when so much of this requires figuring out things about consciousness, or the value of future lives, which seem like philosophically and historically mind-boggling puzzles to me.
If you go to my EA Hub profile, you’ll notice the biggest donation I’ve made was $1,000 to GiveWell in 2014 for unrestricted funds. That was because I knew those funds would increase the pool of money for starting the Open Philanthropy Project, and the donation was matched. You’ll also notice I select pretty much every cause as something to consider, because I’m paranoid about myself, or EA in general, missing out on important information. All I can say about my politics is that I’m a civil libertarian; beyond that, I don’t get offended by reading things when they’re written by people who want to improve EA in earnest. I hope you’ll take my word that I didn’t just edit my EA Hub profile now. That’s the closest thing I have to a badge showing I really do try to stay neutral.
If anyone wants to privately and/or anonymously send me their thoughts on an EA organization and what it’s doing wrong, no matter what it is, I’ll give my honest feedback, and we can have a back-and-forth and hopefully hammer something out to be published. I also don’t particularly favour any EA org right now, because a lot of these organizations are run by people who’ve only been in academia or the software industry, or who are starting non-profits right out of college, and who might just not have the type or diversity of experience to make good plans and models on their own, or the skills for dealing with different kinds of people and getting things done. I’ve thought for a while that all of these organizations have made mistakes, small or large, at different points, mistakes that are really hard to talk about in public, and it feels a bit absurd to me that they’re never talked about.
Feel free to send me stuff. Please don’t send me stuff about interpersonal drama. Treat what you send me like filing a bug report.
though I do think that there are other groups / organizations that substantially outperform EA, which provides an existence proof that one can do much better
Interesting. Which groups could we learn the most from?
I think parts of academia do this well (although other parts do it poorly, and I think it’s been getting worse over time). In particular, if you present ideas at a seminar, essentially arbitrarily harsh criticism is fair game. Of course, this is different from the public internet, but it’s still a group of people, many of whom do not know each other personally, where pretty strong criticism is the norm.
My impression is that criticism has traditionally been a strong part of Jewish culture, but I’m not culturally Jewish so can’t speak directly.
I heard that Bridgewater did a bunch of stuff related to feedback/criticism but again don’t know a ton about it.
Of course, none of these examples address the fact that much of the criticism of EA happens over the internet, but I do feel that some of the barriers to criticism online also carry over in person (though others don’t).
Thanks!
I think parts of academia do this well (although other parts do it poorly, and I think it’s been getting worse over time). In particular, if you present ideas at a seminar, essentially arbitrarily harsh criticism is fair game. Of course, this is different from the public internet, but it’s still a group of people, many of whom do not know each other personally, where pretty strong criticism is the norm.
One guess is that ritualization in academia helps with this—if you say something in a talk or paper, you ritually invite criticism, whereas I’d be surprised to see people apply the same norms to e.g. a prominent researcher posting on facebook. (Maybe they should apply those norms, but I’d guess they don’t.)
Unfortunately, it’s not obvious how to get the same benefits in EA.
I’m surprised to hear that people see criticizing EA as incurring social costs. My impression was that many past criticisms of EA have been met with significant praise (e.g., Ben Kuhn’s). One approach for dealing with this could be to provide a forum for anonymous posts + comments.
I think it really depends on who you criticize. I perceive criticizing particular people or organizations as having significant social costs (though I’m not saying whether those costs are merited or not).
In my post, I said
anything I write that wouldn’t incur unacceptably high social costs would have to be a highly watered-down version of the original point, and/or involve so much of my time to write carefully that it wouldn’t be worthwhile.
I would expect that conditioned on spending a large amount of time to write the criticism carefully, it would be met with significant praise. (This is backed up at least in upvotes by past examples of my own writing, e.g. Another Critique of Effective Altruism, The Power of Noise, and A Fervent Defense of Frequentist Statistics.)
This is a great point—thanks, Jacob!
I think I tend to expect more from people when they are critical—i.e. I’m fine with a compliment/agreement that someone spent 2 minutes on, but expect critics to “do their homework”, and if a complimenter and a critic were equally underinformed/unthoughtful, I’d judge the critic more harshly. This seems bad!
One response is “poorly thought-through criticism can spread through networks; even if it’s responded to in one place, people cache and repeat it other places where it’s not responded to, and that’s harmful.” This applies equally well to poorly thought-through compliments; maybe the unchallenged-compliment problem is even worse, because I have warm feelings about this community and its people and orgs!
Proposed responses (for me, though others could adopt them if they think they’re good ideas):
For now, assume that all critics are in good faith. (If we have / end up with a bad-critic problem, these responses need to be revised; I’ll assume for now that the asymmetry of critique is a bigger problem.)
When responding to critiques, thank the critic in a sincere, non-fake way, especially when I disagree with the critique (e.g. “Though I’m about to respond with how I disagree, I appreciate you taking the critic’s risk to help the community. Thank you! [response to critique]”)
Agree or disagree with critiques in a straightforward way, instead of saying e.g. “you should have thought about this harder”.
Couch compliments the way I would couch critiques.
Try to notice my disagreements with compliments, and comment on them if I disagree.
Thoughts?
“Though I’m about to respond with how I disagree, I appreciate you taking the critic’s risk to help the community. Thank you!”
Not sure how much this helps, because if the criticism is thoughtful and you fail to engage with it, you’re still being rude and missing an opportunity, whether or not you say some magic words.
I agree that if engagement with the critique doesn’t follow those words, they’re not helpful :) Editing my post to clarify that.