Thanks so much for taking the time to write this. As a man, the crux of my feeling disaffected from EA has been this part: “● Take up that humility more generally. Don’t judge that you’re right and another party is wrong before ensuring you know their reasoning — ask someone why they hold the position they do, maybe they’ve thought of something you haven’t just as you may be assuming you’ve thought of things they haven’t.”

As a rule, I have found that EA people believe they are the world’s leading expert on literally every topic. One fellow, for example, said he was starting a blog and wanted submissions on certain topics — topics on which I have presented academic papers at international conferences. I offered to provide articles for his blog, free. He responded that he would have to see my previous work so he could review it. He had a Bachelor’s degree in computer programming and wanted to review my academic work in evolutionary psychology and political science that had been presented at leading international conferences. Because he is the leading expert in every single thing.

Just this week I suggested that EA should focus more on threats to bees and other insect pollinators. Another fellow responded that all the claimed problems are false and that the one issue that does exist is easily solved. Amazing that he knows more than scores of professional entomologists publishing in peer-reviewed journals, despite not being an entomologist or in science at all. But again, he is clearly the world’s leading expert in literally every topic.
At the same time, people like myself — who have put serious effort and several decades into developing our knowledge on certain topics, and who have lengthy records of achieving altruistic results dating from long before your group existed — find that our posts are not approved by the “moderators” of the FB main group, that we are not invited to conferences, and that our views are not respected as a rule.
Whatever data and techniques you have, EA as currently constituted is more counter-effective than effective and is on the road to irrelevance. However intelligent you are, being sure that you literally know everything makes you one of the dumbest people alive.
I think thoughtful, rationality-focused people (not just EAs, but even, say, young software engineers) can often outperform the average ‘expert,’ where expertise is measured by traditional credentials like having a PhD. Many biases pervade academia and other fields (e.g. publication bias, status quo bias, publish-or-perish incentives), and thoughtful people have often done much more than traditional experts to understand and overcome these biases. They also benefit from entering a field without as many preconceptions and personal investments, which lets them synthesize the literature in a less biased way.
I don’t have many examples on hand (and would appreciate it if someone else could provide them), but I believe there’s a solid track record of thoughtful, rationality-focused people disagreeing strongly with traditional experts and being vindicated. Only two come to mind right now:
One is Eliezer Yudkowsky, a self-educated blogger, advocating for a focus on safety in AI at a time when most traditional AI experts thought that was crazy; the traditional AI community has since shifted heavily towards Yudkowsky’s position.
Another is the superforecasters discussed by Phil Tetlock, who perform very well at predicting future events (e.g. whether there will be a civil war in a certain country) even though traditional experts often do little better than chance.
The reason we don’t blindly defer to credentials is not that we think we’re better than everyone else. It’s that we’re often the first to approach new kinds of problems — ones that haven’t been addressed before — with a rigorous, interdisciplinary framework. With those goals and ingredients, newcomers can make progress just as well as traditional experts can.