If everyone makes the same criticism, the opposite criticism is more likely to be true

  • I frequently hear people say EAs rely too much on quantifying uncertain variables, but I basically never hear the opposite criticism. If everyone believed you shouldn't quantify, then nobody would be quantifying, so it couldn't possibly be true that people quantify too much, and in fact the opposite is probably true.

  • Obviously I could make various counterarguments, like maybe the people who think we don't quantify enough are not writing essays about how we need to quantify more. Generally speaking, I don't think this counterargument is correct, but arguing for/against it is harder, so I don't have much to say about it.

  • It's like Lake Wobegon, where all the children are above average. Every single person in the community can't be right that the community is not X enough.

  • Another example: everyone says we need to care more about systemic change

  • I saw a Twitter post saying "EAs way under-update on thought experiments" and thought, damn, that's a spicy take. Then I realized I had misread it: they actually said "over-update", and I thought, wow, what a boring take that's been said a thousand times already.

    • They gave the simulation argument and Roko’s Basilisk as examples. As far as I know, nobody has ever changed their behavior based on either of those arguments. It would be pretty much impossible for people to update less on them than they have

      • I'm sure there are some people somewhere who have updated based on the simulation argument, but I've never met them

    • “People under-update on thought experiments” would have been a much more interesting take because people basically don’t update on thought experiments

  • By a shocking coincidence, I take the opposite side on all these examples: I think EAs should use more quantitative estimates, should care less about systemic change, and should update more on thought experiments

    • Are there any issues where I make the same criticism as everyone else, and I’m actually wrong? Probably, idk

  • I can think of some non-EA-related examples of this phenomenon, but I’m not as interested in those

  • By analogy, the moment when the most people agree that the stock market is going to go up is exactly the moment when the market peaks. The price can't go any higher because everyone who wants to buy has already bought, so there's no one left to push it up. If everyone agrees, everyone /must/ be wrong

Relevant Scott Alexander: https://slatestarcodex.com/2014/03/24/should-you-reverse-any-advice-you-hear/ and https://slatestarcodex.com/2017/04/07/yes-we-have-noticed-the-skulls/. He said it better than me, but my post isn't about exactly the same thing, so I figured it might be worth publishing.

(Note: The way I usually write essays is by writing outlines like this, and then fleshing them out into full posts. For a lot of the outlines I write, like this one, I never flesh them out because it doesn’t seem worth the time. But I figured for Draft Amnesty Day, I could just publish my outline, and most people will get the idea.)