“This very circumstance is not only true “here”, but moreover with public discourse in general, making it clear that we as humans continue to have grave epistemic issues, not only in terms of recognizing that an issue exists but in treating it with the appropriate attention.”
Issues of tact aren’t strictly relevant to your argument, but I’d really recommend keeping passages like this out of pieces that aim to convince people of cause areas. Anything that assumes your conclusion is true and then criticizes people for not believing it is inevitably irritating, and in any case not actually relevant to the point you’re trying to prove.
I also generally think that when you feel that shock at people’s views, it’s more productive to make your first impulse learning why they believe what they do, rather than laying out your case and seeing what comes back. Even if you’re completely unconvinced by what people have to say, you’ll end up in a position to advocate for your views in a way that’s most likely to connect with how others are actually thinking.
This is well done! Acknowledging and talking about what makes hyper-rationalism repulsive to many people (mostly very unfairly!) is constructive and interesting.
This may be out of scope, but in the introductory section describing EA, I’d probably also include a slide or two covering some of the more reasonable criticisms of typical EA beliefs and behaviors, and separate those from the list of 10 barriers rooted in bias and irrational intuition.
Doing that would better set aside the question of the merits of the EA approach itself, and make it easier to focus on these other blockers to wider adoption. It would also make the presentation come across as more even-handed, rather than as “here are the bad reasons people don’t support what I support.” That might earn you more buy-in from the more skeptical members of the audience, while also prompting people who do find the answers intuitive to think about how to improve EA.