Really valuable post, particularly because EA should be paying more attention to Future Perfect—it’s some of EA’s biggest mainstream exposure. Some thoughts in different threads:
1. Writing for a general audience is really hard, and I don’t think we can expect Vox to maintain the fidelity standards EA is used to. It has to be entertaining, every article has to be accessible to new readers (meaning you can’t build up reader expectations over time, the way a sequence of blog posts or a book would), and Vox has to write for the audience they have rather than wait for the audience we’d like.
In that light, look at, say, the baby Hitler article. It has to connect to the average Vox reader’s existing interests, hence the Ben Shapiro intro. It has to be entertaining, so Matthews digresses into time travel and The Matrix. Then it has to provide valuable informational content: an intro to moral cluelessness and expected value.
It’s pretty tough for one article to do all that, AND seriously critique Great Man history, AND explain the history of the Nazi Party. To me, dropping those isn’t shoddy journalism; it’s a valuable insight into how to engage the readers you actually have, not the ideal reader.
Bottom line: People who took the 2018 EA Survey are twice as likely as the average American to hold a bachelor’s degree, and seven times as likely to hold a Ph.D. That’s why Robin Hanson and GiveWell have been great reading resources so far. But if we actually want EA to go mainstream, we can’t rely on econbloggers and think-tanks to reach most people. We need easier explanations, and I think Vox provides that well.
...
(P.S. Small matter: Matthews does not say that it’s “totally impossible” to act in the face of cluelessness, as you implied; he says the opposite. He then writes: “If we know the near-term effects of foiling a nuclear terrorism plot are that millions of people don’t die, and don’t know what the long-term effects will be, that’s still a good reason to foil the plot.” That’s a great informal explanation. Edit to correct that?)
But if we actually want EA to go mainstream, we can’t rely on econbloggers and think-tanks to reach most people. We need easier explanations, and I think Vox provides that well.
Is “taking EA mainstream” the best thing for Future Perfect to try to accomplish? Our goal as a movement is not to maximize the number of people who have the “EA” label; it is to do the most good. If we garble the ideas or epistemology of EA in an effort to maximize the number of people who adopt the “EA” self-label, that seems like a potential example of Goodhart’s Law.
Instead of “taking EA mainstream”, how about “spread memes to Vox’s audience that will cause people in that audience to have a greater positive impact on the world”?
Agreed. If you accept the premise that EA should enter popular discourse, that most generally informed people should be aware of it, and so on, then I think you should like Vox. But if you think EA should be a small, elite academic group rather than a mass movement, that’s another discussion entirely, and maybe you shouldn’t like Vox.
I was referring to the impossibility of weighing highly uncertain possibilities against each other.
It’s pretty tough for one article to do all that, AND seriously critique Great Man history, AND explain the history of the Nazi Party.
Well, you don’t have to explain them… you just have to not contradict them.
Bottom line: People who took the 2018 EA Survey are twice as likely as the average American to hold a bachelor’s degree, and seven times as likely to hold a Ph.D. That’s why Robin Hanson and GiveWell have been great reading resources so far. But if we actually want EA to go mainstream, we can’t rely on econbloggers and think-tanks to reach most people. We need easier explanations, and I think Vox provides that well.
But when looking through most of these articles, I don’t see plausible routes to growing the EA movement. Some of them talk about things like GiveWell charities and high-impact actions people can take, and occasionally they mention the EA community itself. But many of them, especially the political ones, have no connection. As you say, this isn’t a sequence of blog posts that someone is going to follow over time. They’ll read the article, see an argument about marijuana or whatever that happens to be framed in a nicely consequentialist manner, and then move on.
This is kind of worrying to me. I normally think of Vox as a rational, ideological outlet that knows how to produce effective (if often misleading) media to canvass support for its favored causes. Yet they don’t seem to be applying much of this capacity toward bolstering the EA movement itself, which suggests that it’s really not a priority for them compared to their favored political issues.
Great discussion here. I’m trying to imagine how most people consume these articles. Linked from the Vox home page? Shared on Facebook or Twitter? Do they realize these aren’t just standard Vox articles? Some readers probably barely know what Vox is. Certainly, we are all aware of the connection to EA, but I bet most readers are pretty oblivious.
In that case, maybe these tangentially related or unrelated articles don’t matter too much. On the other hand, the better articles may spark an interest that leads a few people toward finding out more about EA and becoming involved.