I’m sympathetic, but to make the counterpoint: EA needs some way to protect against bullshit.
Scientists gatekeep publication behind peer review. Wikipedia requires that every claim be backed up with a source. Journalists employ fact-checkers. None of these are in any way perfect (and they are often deeply flawed), but the point is that, at least in theory, one qualified person with expertise in the subject has checked what has been written for errors.
In contrast, how does EA ensure that the claims made here are actually accurate? First, we hope that people are honest and get everything right to begin with, but of course that guarantees nothing. The main mechanism relied upon is that some reader will bother to read an article closely enough to spot its errors, and then call them out in the comments or in a post of their own. Of course this is sometimes acrimonious. But if we don’t put up the criticism, the BS claims will cement themselves and start influencing real-world decisions that affect millions of dollars and people’s lives.
If we stop “sanctifying” criticism, then what exactly is stopping BS from taking over the entire movement (if it hasn’t already)? I’ve certainly seen genuinely good criticism dismissed as bad criticism because its target misunderstood the critique, or started from different assumptions. If you’re going to rely on criticism as the imperfect hammer to root out bullshit nails, you kinda have to give it a special place.
I don’t see how asking for higher standards for criticism makes EA defenseless against “bullshit.”
I actually would argue the opposite: if we keep encouraging and incentivizing any kind of criticism, and tolerate needlessly acrimonious personal attacks, we end up in an environment where nobody proposes anything besides the status quo, and the status quo becomes increasingly less transparent.
Three recent examples that come to mind:
User @Omega stopped writing their criticism of AI Safety organizations partly because responses to their criticism assumed bad faith
@Happier Lives Institute stopped writing their criticism of (and alternatives to) GiveWell in part because some of the criticism they received consisted largely of personal attacks
@Dustin Moskovitz claims that they[1] write less on this forum because of attacks from EAs. He also wrote that the attitude in EA communities is “give an inch, fight a mile”, and that he’ll therefore “choose to be less legible instead”.
I think Holly_Elmore herself is another example: she used to write posts like “We are in triage every second of every day”, which I think are very useful for making EA less “bullshit”, but she now mostly doesn’t post on this forum, partly because of the low-quality but costly criticism she receives.
I largely agree with the last section of this comment from Aaron Gertler, written one year ago (he notes he was writing as himself, not as a moderator):

“I think the average comment is just a bit less argumentative / critical than would be ideal. I think the average critical comment is less kind than would be ideal. I want criticism to be kind, but I also want it to exist, and pushing people to be kinder might also reduce the overall quantity of criticism. I’m not sure what the best realistic outcome is.”
I personally fear that the current discussion environment on this forum errs too much in the “unkind criticism” direction, and I see at least two large downsides:
It encourages some form of “enlightened immobilism”, where anyone proposing doing anything differently from the status quo gets instantly shut down.
It strongly discourages transparency from most projects, especially (but not exclusively) less established or more speculative ones.
I used to think that accepting callousness was the price of technical excellence, e.g. from reading how people like the famous software engineer Linus Torvalds used to communicate. After seeing many extremely competent people communicate criticism in a professional and constructive manner, I have completely changed my mind. (Torvalds himself also apologised and changed his communication style years ago.)
I believe that a culture of more constructive and higher-quality criticism would encourage more discussion overall, not less, especially from experienced professionals who have different perspectives from mainline EA thinking.
See also the related paragraph from the Charity Entrepreneurship handbook.
[1] I’m not sure if he meant Good Ventures, Open Philanthropy, or some other group.
I think telling people to critique less is a suboptimal solution for this. At least in theory, it would be better for people to be willing to do things despite getting critiqued.
Someone can write a critique of anything. Instead of checking whether a critique exists, you could check “does a neutral party think this critique is stronger than average for a random EA project?”, or something like that. (If the project is weaker than average in light of the critique, that suggests resources should perhaps be reallocated.)
Downside risk is everywhere, and its mere existence shouldn’t be sufficient to cause inaction.
I think it’s important to exercise judgment here. Many (e.g. political) communities reflexively dismiss criticism, in ways that are epistemically irresponsible and lead them into pathology. It’s important to be more open-minded than that, and to have a general stance of being open to the possibility that one is wrong.
But there’s also a very real risk (sometimes realized, IMO, on this forum) of people going too far in the opposite direction and reflexively accepting criticism as apt or reasonable when it plainly isn’t. (This can take the form of downvoting or pushback against those of us who explain why a criticism is bad/unreasonable.) Sometimes people enact a sort of performative open-mindedness which calls on them to welcome anti-EA criticism and reject criticism of that criticism, more or less independently of the actual content or its merits. I find that very annoying.
(An example: when I first shared my ‘Why Not Effective Altruism?’ draft, the feedback here seemed extremely negative and discouraging, with some even accusing me of bad faith, because people didn’t like that I was criticizing the critics of EA. Now that it’s published, many seem to appreciate the paper and agree that it’s helpful. Shrug.)
My sense is that this problem isn’t as bad now as in early 2023 when EA was going through a ridiculous self-flagellation phase.
What you call performative open-mindedness (I have been internally referring to it as epistemic virtue signaling) is a very real and important phenomenon, and one I wish people wrote about more and were more aware of.
I’m not sure if this is the same phenomenon, or a different phenomenon that uses the same word, but I see calls for “open-mindedness” in the woo community. When expressing my disbelief in ghosts/astrology/ESP/etc., I’m told I “need to be more open-minded”.
Maybe I spoke too soon: apparently it “seems unfair” to characterize Wenar’s WIRED article as “discouraging life-saving aid”? (A comment that was immediately met with two agree-votes!) The pathology lives on.
I read your post as saying we need some level of criticism, and I agree. I understand the OP as reacting to the current temperature in online communities like EA, and arguing that right now we have too much criticism and not enough doing, which is because criticism is much cheaper than doing.
If you wanted to effectively take the other side, you’d need to quantify the current level of criticism and its chilling effect, and argue for turning the dial up or down. That, of course, is very hard, which is why criticizing is so much easier than doing.