3. I have no personal or inside info on Future Perfect, Vox, Dylan Matthews, Ezra Klein, etc. But it seems like they’ve got a fair bit of respect for the EA movement—they actually care about impact, and they’re not trying to discredit or overtake more traditional EA figureheads like MacAskill and Singer.
Therefore I think we should be very respectful towards Vox, and treat them like ingroup members. We have great norms in the EA blogosphere about epistemic modesty, avoiding ad hominem attacks, viewing opposition charitably, etc. that allow us to have much more productive discussions. I think we can extend that relationship to Vox.
Using this piece as an example, if you were criticizing Rob Wiblin’s podcasting instead of Vox’s writing, I think people might ask you to be more charitable. We’re not anti-criticism (we’re absolutely committed to truth and honesty, which means seeking good criticism), but we also have well-justified trust in the community. We share a common goal, and that makes it much easier to cooperate.
Let’s trust Vox like that. It’ll make our cooperation more effective, we can help each other achieve our common goal, and, if necessary, we can always take back our trust later.
Update: Upon further evidence, I’ve decided that this model is more wrong than right. Specifically, (2) is not really the case; it is more a matter of implicit bias among their staff. They are more genuine and less Machiavellian than I thought, so cooperation is easier.
My mental model is this -
1. Vox has produced flawed and systematically biased media for years. (evidence: their previous political reporting and the various criticisms it’s received, similar to Breitbart for instance)
2. Vox knows that they have produced flawed and systematically biased media for years, but continues doing it anyway because it maximizes their revenue and furthers their political goals. (evidence: they’re not idiots, the criticisms seem sound, and they do not retract or walk back their contested pieces)
3. If Vox cared significantly about the EA movement, they wouldn’t produce flawed and systematically biased media in the EA column.
For this reason I do not give them the benefit of the doubt, though I’m aware that enough messy background and priors are involved to make this disagreement difficult to resolve in any concise conversation.
I agree with the other respondent that Dylan Matthews and Ezra Klein genuinely seem to care about EA causes (Dylan on just about everything, even AI risk [a change from his previous position], and Ezra at least on veganism). Hiring Kelsey Piper is one clear example of this—she had no prior journalism experience, as far as I’m aware, but had very strong domain knowledge and a commitment to EA goals. Likewise, the section’s Community Manager, Sammy Fries, also had a background in the EA community.
It would have been easy for Vox to hire people with non-EA backgrounds who had more direct media experience, but they did something that probably made their jobs a bit harder (from a training standpoint). This seems like information we shouldn’t ignore (though of course, for all I know, Sammy and Kelsey may have been the best candidates even without their past EA experience).
Really good journalism is hard to produce, and just like any other outlet, Vox often succumbs to the desire to publish more pieces than it can fact-check. And some of their staff writers aren’t very good, or at least not as good as we might wish they were.
But still, because of Future Perfect, there has been more good journalism about EA causes in the last few months than in perhaps the entirety of journalism before that time. The ratio of good EA journalism to bad is certainly higher than it was before.
There is a model you could adopt under which the raw amount of bad journalism matters more than the good/bad ratio, because one bad piece can cause far more damage than any good piece can undo. But you don’t seem to have argued that Vox is going to damage us in that sense, and their most important/central pieces about core EA causes generally come from Kelsey Piper, whom I trust a lot.
I agree that some of Vox’s work is flawed and systematically biased, but they’ve also produced enough good work that I hope to see them stick around. What’s more, the existence of Future Perfect may lead to certain good consequences, perhaps including:
Other news outlets hiring people with EA backgrounds to write on similar topics, following in Vox’s footsteps.
News outlets using Future Perfect as a source when they write about EA issues (I’d much prefer that a journalist learning about AI risk start with Piper rather than with other mass-media articles on the subject).
Other EA people working with Vox and gaining valuable insight into how the media works; even if it turns out that we should try not to engage with the media whenever possible, at least having a few people who understand it seems good.
I think Vox, Ezra Klein, Dylan Matthews, etc., would disagree about point 2. Not to put words in someone else’s mouth, but my sense is that Ezra Klein doesn’t think their coverage is substantially flawed and systematically biased relative to other comparable sources. He might even argue that their coverage is less biased than most.
Could you link to some of the criticisms you mentioned in point 1? I’ve seen others claim that as well on previous EA Forum posts about Future Perfect, and I think it would be good to have at least a few sources on this. Many EAs outside the US probably know very little about Vox.