I think it would be a huge mistake to condition support for AI journalism on object-level views like this. Being skeptical of rapid AI development is a perfectly valid opinion to hold, and I think it’s pretty easy to make a case that the actions of some AI leaders don’t align with their words. Both of the articles you linked seem perfectly fine and provide evidence for their views; you just disagree with the authors’ conclusions.
If you want journalism to be accurate, you can’t prematurely cut off the skeptical view from the conversation. And I think skeptical blogs like Pivot-to-AI do a good job of compiling examples of failures, harms, and misdeployments of AI systems; if you want to build a coalition against harms from AI, excluding skeptics is a foolish thing to do.
I think this is really fair pushback, thanks! Skeptical coverage of AI development is legitimate. The way I wrote this over-implied that these articles are a failing of journalism; the marketing-hype claim is not baseless.
But I’m torn. I still think there’s something off about current AI coverage, and this could be a valid reason to want more journalism on AI. Most articles seem to default to either full embrace of AI companies’ claims or blanket skepticism, with relatively few spotlighting the strongest version of arguments on both sides of a debate.
Also, I think my core point stands without conditioning on object-level views: we need more journalists who can dig deep into AI development. More investigation and scrutiny from all angles would serve us better than our current situation of relatively thin coverage.
“Most articles seem to default to either full embrace of AI companies’ claims or blanket skepticism, with relatively few spotlighting the strongest version of arguments on both sides of a debate.” I’ve never agreed with anything as strongly in my life. Both of these things are bad, and we don’t need to choose a side between them. And note that the issue here isn’t about these positions being “extreme”. An article that actually tries to make a case for foom by 2027, or for “this is all nonsense, it’s just fancy autocomplete and overfitting on meaningless benchmarks”, could easily be excellent. The problem is people not giving reasons for their stances, and substituting either rewritten PR or plain social distaste for Silicon Valley.
I agree that for journalism it’s important to be very careful about introducing biases into the field.
On the other hand, I suspect the issue they are highlighting is more that some people are so skeptical that they don’t bother engaging with this possibility or the arguments for it at all.
Executive summary: Journalism on AI is a crucial but underdeveloped field that can shape public understanding, influence policy, and hold powerful actors accountable, yet it suffers from staffing shortages, financial constraints, and a lack of technical expertise.
Key points:
AI journalism has high potential—it can improve governance, highlight risks, shape public discourse, and investigate AI companies, as demonstrated by past impactful articles.
Current AI journalism is inadequate—click-driven revenue models discourage deep reporting, too few journalists cover AI full-time, and many outlets fail to take rapid AI development seriously.
More AI journalists are needed—individuals with technical, political, and investigative skills are in demand, and funders currently see an additional AI journalist as more valuable than an additional AI policy or safety researcher.
Journalism differs from advocacy—effective journalism prioritizes fact-finding and questioning over pushing specific solutions or ideologies.
The Tarbell Fellowship offers a path into AI journalism—it provides training, mentorship, funding, and placements at major news outlets, with applications for 2025 closing on February 28th.
This comment was auto-generated by the EA Forum Team. Feel free to point out issues with this summary by replying to the comment, and contact us if you have feedback.