greater confidence in EEV lends itself to supporting longshots that reduce x-risk or otherwise seek to improve the long-term future in a highly targeted, deliberate way.
This just depends on what you think those EEVs are. Long-serving EAs tend to lean towards thinking that targeted efforts towards the far future have higher payoff, but that also has a strong selection effect. I know many smart people with totalising consequentialist sympathies who are sceptical enough of the far future that they prefer to donate to GHD causes. None of them are at all active in the EA movement, and I don’t think that’s coincidence.