What’s going on in video in AI Safety these days? (A list)
My write-up as of September 2025 (updated December 2025) - feel free to ask me for a more updated Source of Truth if you’re interested in this space—and please let me know what I’m missing!
Key Players Making Video:
Organizations
Does research on AI for policymakers and communicators.
Has started its own video program—check out their videos! Run by Petr Lebedev, an ex-producer/scriptwriter for Veritasium
Future of Life Institute (FLI): Has an accelerator program to fund creators making AI-related videos as well as a “Keep the Future Human” contest
BlueDot Impact: is contracting with Mateus de Sousa to make shortform videos
AI Digest: Produces explainers on AI and as far as I know might be interested in someone running a video program for them
Control AI is getting great clips from podcasts and putting them up on Twitter. I’ve suggested that they should create a spreadsheet of these so everyone can use them in their videos. They’ve also been supporting existing creators to make videos about AI Safety, like this SciShow video
Conor Axiotes: making an AI documentary
Michael Trazzi: Made the SB 1047 Documentary and makes TikToks
80,000 Hours Video Program (AI in Context):
Two full-time people (Chana Messinger, Aric Floyd) and contractors.
Our main effort is the YouTube channel “AI in Context.”
Eventually we might look to expand—more videos, more channels, maybe pitching to streaming services
Scriptwriting is our current biggest bottleneck
We contract for production and film people (lights, camera work, sound)
80,000 Hours podcast is released as videos
Martin Percy is an experienced filmmaker making interactive films about AI such as AI Basics: Thrills or Chills
Giving What We Can is not an AI safety organization, but is starting up a video program headed by Justin Portela
“Fancier Crowd” that I’m going to be vague about: People with experience in Netflix pitches and successful political social media campaigns are becoming more involved in communicating AI safety through video, often independent of EA organizations
Organizations that might be interested in doing more video
Seismic Foundation: aims to raise public awareness and engagement with AI safety through broad-reach, relatable content for targeted groups of the general public, such as series and documentaries. They’re also starting up a content engine—they’re looking for creator partnerships, amplifying content or ideas from other AI Safety organizations and they’re on the hunt for a host / talent for some of their projects.
YouTubers / TikTokers
Not a complete list of people in this space
Longform
Siliconversations: A channel making AI Safety videos; has received funding from the Future of Life Institute (FLI)
John Leaver: Runs Digital Engine, Pindex, and Inside AI; has been very helpful and is looking to start a new channel, often seeking hosts.
Drew Spartz: Runs a very successful YouTube channel on AI Safety and had a multi-million-view video on AI 2027
Explainable AI is partnering with AI organisations that want to increase the public exposure of their research and work. So far they have run campaigns for papers from CAIS, Apollo Research, and Control AI. They hope to grow their network of creators making content, and to talk to any AI orgs interested in having their work reach a wider audience. See: https://www.instagram.com/explainable__ai/
Mithuna Yoganathan, a physics YouTuber, recently made a video on AI 2027
Computerphile posted a lot of videos about AI Safety
Spencer Greenberg is ramping up making YouTube and shortform videos and has a team of people working with him, including Liam Elkins
There are also points of connection to CGP Grey, Veritasium and Kurzgesagt
Kris Gasteratos is making excellent videos about cultivated meat; I think there’s a lot he’s learned that I’d like to learn from
(Copied from Matrice’s excellent comment below): There’s a decent amount of French-speaking ~AI safety content on YouTube:
@Shaïman Thürler’s channel Le Futurologue
@Gaetan_Selle 🔷 ’s channel The Flares
@len.hoang.lnh’s channel Science4All and Thibaut Giraud’s channel Monsieur Phi, the two channels cited by 9 of the 17 people citing a YouTube channel as where they first heard of EA in the 2020 EA Survey
The CeSIA advised on the production of this video, which reached nearly five million views by the time I am writing this (i.e. plausibly >10% of the French adult population)
The CeSIA also gave an interview to the popular scientific-skeptic channel La Tronche en Biais, which has recently expressed interest in posting more on AI safety
David Louapre, who runs Science Étonnante, one of the most popular French-language science channels, has just this week announced pivoting to working as an AI safety researcher at Hugging Face, so it’s possible more will come from that direction too
TikTok
Zac of the Neural guide pivoted recently to talking about AI Safety at Explainable
parthknowsai: daily videos explaining AI news, sometimes AI safety relevant papers
nicole_flowing: talks about the implications of AI, and sometimes AI Safety related stuff
_dbrogle: daily technical AI explainers + some other stuff
PhilosopherGames: Recent addition around December 2025, won a prize in the Keep the Future Human contest
Filmmakers
Kate Hammond, of https://www.softcutfilms.com/ is making an AI Safety Documentary
Elizabeth Cox, of Should We Studios, producing Ada, about pressing problems
Other points of connection to Hollywood
Harrison Wood is a video producer who has been working with EA organizations
Retreats / coordination / upskilling
In conjunction with Mox, Explainable is putting on the Frame Fellowship
I’ve heard about two other people who are considering putting on coordination retreats for people interested in this space.
Lots of people are interested in working on this / helping out:
e.g.
Another ex-big channel producer/scriptwriter might be interested in getting involved
We know several people from the ex-Veritasium crowd
Some of the people who worked on Michael Trazzi’s documentary
People in the “EA Youtuber” WhatsApp group chat
I know of two videographers interested in x-risk looking for work
Someone who’s done videography on both sides of the camera, has prior scriptwriting experience, and is poking around the space
Someone making tools for video creators
And others!
I also have a long list of folks from the expression of interest forms I have out
Links where you can find more information
Where can I find videos about AI safety? (tries to be quite thorough and is a great resource, but didn’t satisfy my personal use case for the kind of video content that is becoming more common now, with more urgency and higher production values)
Reach out if you’d like to be involved, have experience in video, or know people who do!
There’s a decent amount of French-speaking ~AI safety content on YouTube:
@Shaïman Thürler’s channel Le Futurologue
@Gaetan_Selle 🔷 ’s channel The Flares
@len.hoang.lnh’s channel Science4All and Thibaut Giraud’s channel Monsieur Phi, the two channels cited by 9 of the 17 people citing a YouTube channel as where they first heard of EA in the 2020 EA Survey
The CeSIA advised on the production of this video, which reached nearly five million views by the time I am writing this (i.e. plausibly >10% of the French adult population)
The CeSIA also gave an interview to the popular scientific-skeptic channel La Tronche en Biais, which has recently expressed interest in posting more on AI safety
David Louapre, who runs Science Étonnante, one of the most popular French-language science channels, has just this week announced pivoting to working as an AI safety researcher at Hugging Face, so it’s possible more will come from that direction too
Oh, nice!
On CGP Grey, he has 6.8M YouTube subs and seems to get the alignment concerns. He recently conveyed the alignment risk in this episode of his tech podcast (Revisiting Humans Need not Apply):
“AI is more like biological weapons because they can act autonomously and evolve beyond what you built. Nuclear bombs don’t walk out of factories on their own; pathogens do.”
Might be worth someone reaching out about e.g. sponsorship.
Glad you’re working with some of the people I recommended to you; I’m very proud of that SB-1047 documentary team.
I would add to the list Suzy Shepherd who made Writing Doom. I believe she will relatively soon be starting another film. I wrote more about her work here.
Thank you!
FAR AI posts recordings of talks from the events they organise on YouTube.
IMO that’s a different category—there’s a lot of that kind of thing as well and I’m glad it exists but I think it’s useful to separate out.
“Scriptwriting is our current biggest bottleneck” Can you elaborate a bit? Is the hard part deciding what the messages are that you want to deliver or rather how to deliver them?
Not sure that’s the same distinction I would make, but broadly it just takes a long time to write a full script that we’re happy with: figuring out the right structure, the high-level narratives, the beats we want to hit, the takeaways, giving it a good emotional arc, etc.
@ChanaMessinger I think it would be good to add Hank Green’s interview with Nate and SciShow’s entry!
“People with experience in Netflix pitches and successful political social media campaigns are becoming more involved in communicating AI safety through video, often independent of EA organizations”- why aren’t we able to attract them towards this forum at least? Or maybe agents who are working with those people?