Vox has a new department, Future Perfect, covering the world from an effective altruist perspective

Today, Vox launched a new vertical, Future Perfect, dedicated to approaching the world from an effective altruist perspective. I am one of the new staff writers Vox hired to help make this happen, and I want to invite you all to check out the launch.

Future Perfect includes a podcast, guest contributions, and regular reporting from me, Dylan Matthews, and Abby Higgins on what we consider the most important issues of the day. The rest of our team consists of Engagement Manager Sammy Fries and Elbert Ventura, a senior policy editor at Vox.

I know that the question most EAs will be interested in is ‘what impacts can we expect this to have on the world?’ There are a bunch of angles that come to mind for me.

Vox has a huge existing audience sympathetic to EA causes. I think it's plausible that the success of this vertical could raise awareness of, and encourage engagement with, causes that EA values highly, by getting the ideas of effective giving out to people who are sympathetic to them but haven't yet been exposed to them.

There are a lot of failures of communication and information propagation within the EA movement. One thing I'll be doing a lot of is writing up things that some people have known for years but that are still widely misunderstood. For example, I've talked with EAs who still don't really understand the case for artificial intelligence risks, or who are unaware that the case for schistosomiasis treatments rests on long-term income effects rather than the limited short-term health ones, or who don't know where the research stands on spillover effects of cash transfers. It seems valuable to have those things explained accessibly and accurately.

There's been a lot of discussion lately about how to provide more support and engagement for EAs who are not in a position to do direct work, start a meetup group, or otherwise dive into higher levels of involvement. I think this vertical might help with that, by providing a steady stream of EA content that allows people to deepen their knowledge of EA conventional wisdom.

While Future Perfect will aim to cover every cause area that gets a lot of discussion within effective altruism, our balance of coverage is likely to differ from the cause prioritization of people currently heavily involved in EA. We are not representing EA, the movement; we are a project dedicated to looking at the world through an EA lens.

It seems plausible that Future Perfect will expose more people to some EA concepts and ideas without particularly increasing involvement in effective altruism, the movement. But it also seems plausible that it could lead to substantial growth of the EA movement. I'm interested in your thoughts on what to make of that, and I've heard a lot of different perspectives aired: on the one hand, more people means more talent, donations, energy, and influence; on the other hand, ideas might get watered down to be more palatable as they spread more widely. The best thing for an individual to do on the margin is very different for a movement of a few thousand people than for a movement of a few million, and if effective altruism grew too rapidly it might take a while for recommendations to catch up with the new best options, leading people to make unwise career choices.

Some launch-day pieces to check out:

Dylan Matthews writes about our vision for Future Perfect.

Ezra Klein’s interview with Bill Gates includes interesting insights into how Gates is thinking about global poverty, x-risk and animals.

Ron Klain on pandemic preparedness.

You can also see all our work on Vox’s Future Perfect page.