I disagree with this disagreement.
EA is built on a foundation of rejecting the status quo. EA might only do that in places where the status quo is woefully inadequate or false in some way, but the status quo is still the status quo, and it will strike back at people who challenge it.
The phenomenon described above is a side effect of optimization, not “contrarian bias”. Contrarian bias is also a real problem for many people in EA, and especially for rationalists, but the only common factor is that neither group attracts the kind of people who assume that everything is all right and go along with it.
I disagree with your disagreement with my disagreement!
The foundation of EA is (or at least should be) finding the truth. We should only reject the status quo if the status quo is wrong.
I don’t have a problem with EA trying out hot takes and contrarian ideas, because finding cases where the status quo is genuinely wrong is valuable and confers a large competitive advantage. But I think this very fact creates a bias toward accepting such ideas, even when they are not strictly true.