A list of EA-related podcasts
Podcasts are a great way to learn about EA. Here’s a list of the EA-related podcasts I’ve come across over the last few years of my podcast obsession.
I’ve split them up into two categories:
Strongly EA-related podcasts: Podcasts run by EA organisations or otherwise explicitly EA-related.
Podcasts featuring EA-related episodes: Podcasts which are usually not EA-related but have some episodes which are about an EA idea or interviewing an EA-aligned guest.
Please add to the comments any podcasts that I have missed. I am always excited to find out about more interesting podcasts!
Strongly EA-related podcasts
Doing Good Better Podcast- Five short episodes about EA concepts. Produced by the Centre for Effective Altruism. No new content since 2017.
The Life You Can Save Podcast- Episodes from Peter Singer’s organisation that focus on alleviating global poverty. The latest episodes are interviews with EA organisation staff.
The Turing Test—The newly restarted EA podcast from the Harvard University EA group. Interviews with EA thinkers including Brian Tomasik on ethics, animal welfare, and a focus on suffering, and Scott Weathers on Charity Science Health.
80,000 Hours Podcast—Robert Wiblin leads long-form interviews (up to 4 hours) with individuals in high-impact careers. This podcast really gets into the weeds of the most important cause areas.
Global Optimum—An informal podcast by professional psychology researcher Daniel Gambacorta, discussing psychology findings that can help you become a more effective altruist. There’s usually no padding in this podcast; it gets straight to the point.
Future Perfect Podcast—The podcast part of Vox Media’s Future Perfect project. Dylan Matthews leads scripted discussions about interesting and hopefully effective ways to improve the world.
Morality is Hard—Michael Dello-Iacovo interviews guests about topics related to effective animal advocacy.
Future of Life Podcast—Interviews with researchers and thought leaders who the Future of Life Institute believe are helping to “safeguard life and build optimistic visions of the future”. They include a series on AI alignment and a recent series on climate change.
Wildness—A new podcast from the Wild Animal Initiative. Narrative episodes based around a theme relevant to wild animal welfare research, typically including multiple interviews with animal welfare researchers.
EARadio—Hundreds of audio recordings of EA Global talks. Some episodes are hard to follow because the visual information used in the presentations is missing.
Sentience Institute Podcast - New podcast on effective animal advocacy.
Podcasts featuring EA-related episodes
Our Hen House—Jacy Reese on the end of animal farming; Joey Savoie on using charity entrepreneurship to help animals.
The Joe Rogan Experience—Nick Bostrom on the simulation argument; Will MacAskill on EA.
The Most Interesting People I Know—Chloe Cockburn on US justice system reform; Lewis Bollard on ending factory farming; Spencer Greenberg on lots of things related to EA; Andres Gomez Emilsson of Qualia Research Institute on solving consciousness.
Autocracy and Transhumanist Podcast—Phil Torres, Seth Baum, and Anders Sandberg on the long-term future; Jeff Sebo on the moral value of other minds.
The Future Thinkers—Phil Torres on the long-term future and existential risks; Daniel Schmachtenberger on generator functions for existential risks, global phase shift, and mitigating existential risks.
Making Sense with Sam Harris—Lots of episodes about consciousness, meaning, and ethics. In particular: Will MacAskill on EA; Nick Bostrom on existential risks; Eliezer Yudkowsky on AI.
Philosophize This!—A brilliant episode on Peter Singer and effective altruism.
EconTalk—Bjorn Lomborg of the Copenhagen Consensus on cost-effectiveness estimates and climate change. There are many other episodes related to global development, economic prosperity, political change, etc. Also see Robert Wiblin’s 75 favourite EconTalk episodes.
The Dissenter—Jacy Reese on the end of animal farming; Peter Singer on ethics, veganism, and EA.
The Good Life—Robert Wiblin on altruistic careers.
Rationally Speaking—Dylan Matthews on kidney donation and global poverty; Helen Toner on misconceptions about AI; Kelsey Piper on Future Perfect; Robert Wiblin on an updated view of the best ways to help humanity; Spencer Greenberg on improving the research process; Anders Sandberg and Owen Cotton-Barratt on the future (separate episodes); Amanda Askell on low risks with high stakes; Will MacAskill on moral uncertainty; Phil Tetlock on superforecasting; Peter Singer on utilitarianism; Holden Karnofsky on GiveWell.
Mission Daily—Robert Wiblin on existential risks; Will MacAskill on EA; Elie Hassenfeld on GiveWell.
Giving Thought—A few episodes on EA’s take on philanthropy, including some criticisms.
Very Bad Wizards—Will MacAskill on EA and moral uncertainty.
Tanner Lectures - University of Oxford lectures about philanthropy, recorded as a podcast. Here’s one about EA: “From Moral Neutrality to Effective Altruism: The Changing Scope and Significance of Moral Philosophy”.
*Edited to include hyperlinks.
Thanks for compiling this.
I’ve created a ListenNotes list with all the “Strongly EA-related podcasts” and a few others here. It displays the most recent episode from each of those podcasts and lets you import them all easily to your favorite podcast app.
Two more podcasts:
Increments by Ben Chugg and Vaden Masrani.
Clearer Thinking by Spencer Greenberg of Spark Wave.
Luke Muehlhauser’s Conversations from the Pale Blue Dot had an episode interviewing Toby Ord back in January 2011. This is from before the term “effective altruism” was being used to describe the movement. I think it may be the first podcast episode to really discuss what would eventually be called EA, with the second oldest podcast episode being Massimo Pigliucci’s interview with Holden Karnofsky on Rationally Speaking in July 2011.
(There was plenty of discussion online about these issues in years prior to this, but as far as I can tell, discussion didn’t appear in podcast form until 2011.)
New podcast on AI X-risk research: AXRP.
Although this was announced in a separate EA Forum post, I’m adding a comment here so that all EA-related podcasts can be found in the same thread.
Note that I continue to keep this list updated.
Fin Moorhouse and a friend started the podcast Hear This Idea. Fin writes about the podcast here, and says:
And a few of the interviewees are EAs whose names I recognise. (Some others may be EAs who I just happen not to know of.)
Alignment Newsletter Podcast: http://alignment-newsletter.libsyn.com/
The Lunar Society
(I haven’t listened yet and am not yet able to recommend it, but it seems EA-relevant)
Having listened to several episodes, I can strongly recommend this podcast. One of the very best.
Technical AI Safety Podcast
AI X-Risk Podcast
The Inside View also focuses on AI alignment. There’s a YouTube channel with videos of the interviews. Sometimes there are interview highlights on LessWrong.
Radio Bostrom—“Audio narrations of academic papers by Nick Bostrom.”
It would be really great if these were hyperlinks...
Would take some time, but might be useful for people gathering EA resources?
Done. Thanks for the nudge to put a little more time into it.
Nice! Thanks
Yes, agree that it would have been natural to include hyperlinks in this otherwise very helpful post.
Pablo’s list does include links.
Future Matters
A machine-read audio version of the Future Matters newsletter.
There is also the Future Matters Reader.
It seems like a lot of those writings weren’t on Nonlinear’s podcast feeds, either due to not being on the EA Forum / LessWrong / Alignment Forum or for some other reasons.
Two book-length series of rationality-related posts by Eliezer Yudkowsky have been made into podcast versions:
Rationality: From AI to Zombies (aka “the sequences”)
There are two different audio versions of this, both free:
The “official” version (search for “audiobook” on that page)
The unofficial version
Harry Potter and the Methods of Rationality (HPMOR)
(Not sure if those are the most useful links. Personally I just found the podcasts via searching the Apple Podcasts app.)
I found Rationality: From AI to Zombies very useful and quite interesting, and HPMOR fairly useful and very surprisingly engaging. I’ve ranked them as the 4th and 30th (respectively) most useful EA-related books I’ve read so far.
Founders Pledge now has a podcast, featuring interviews with their members and researchers: https://founderspledge.com/stories/category/podcasts
Cold Takes Audio is Holden Karnofsky reading posts from his new-ish blog site. I’d highly recommend it.
Nonlinear Library has machine-read (but still pretty good) versions of a large and increasing number of posts from the EA Forum, LessWrong, and the Alignment Forum. See https://forum.effectivealtruism.org/posts/JTZTBienqWEAjGDRv/listen-to-more-ea-content-with-the-nonlinear-library This is probably the podcast I’ve listened to most often since it came out, and will probably remain the podcast I listen to most often for the indefinite future.
Nonlinear Fund also now have additional podcast feeds:
Feeds for top posts of all time (or something like that) from the Alignment Forum, from the EA Forum, and from LessWrong
a feed called “the alignment section” which seems to be curated by them (and seems to me to be a pretty good selection)
I imagine they might release more feeds in future, so it may be worth occasionally searching for podcasts by “The Nonlinear Fund” in podcast apps to see if others come up.
I’ve been hitting these feeds pretty hard and really valuing them. For example, they’ve allowed me to easily fit into my schedule Richard Ngo’s sequence on AGI Safety from First Principles, some old Luke Muehlhauser posts on science/rationality-based self-help approaches, and all the core readings in the AGI Safety Fundamentals governance course.
Naked Scientists (BBC radio show and podcast) have done a bunch of interviews with CSER researchers:
https://www.cser.ac.uk/news/naked-scientists-planet-b/
https://www.cser.ac.uk/news/haydn-belfield-interviewed-naked-scientists/
https://www.cser.ac.uk/news/workshop-featured-on-the-naked-scientists-podcast/
https://www.cser.ac.uk/news/podcast-countdown-artificial-intelligence/
https://www.cser.ac.uk/news/podcast-interviews-martin-rees/
Conversations with Tyler
The Portal with Eric Weinstein
Something I’m surprised neither I nor anyone else has mentioned yet: the Slate Star Codex Podcast. This consists almost entirely of audio versions of SSC articles, along with a handful of recordings of SSC meetups (presentations + Q&As).
(I think this is my second favourite EA-related podcast, with the 80k podcast being first.)
Sorry for the late comment. I’ve recently been listening to, and enjoying, The End of the World with Josh Clark. It seems like a really solid and approachable introduction to existential risks. It starts by covering why x-risks might be things that we should be concerned about, and then talks about AI, biosecurity and other possible threats. Includes interviews with Nick Bostrom, Toby Ord, Anders Sandberg, Robin Hanson and others :)
Un equilibrio inadecuado (Spotify—Apple Podcasts—Google Podcasts)
Interviews in Spanish on EA topics. I particularly enjoyed the episode with Andrés Gómez Emilsson from Qualia Research Institute. Sadly, no new content since October 2021.
Thank you for sharing this.
Alex Lintz made a collection of AI Governance-related Podcasts, Newsletters, Blogs, and more, through which I’ve found some podcasts or individual podcast episodes that I’ve found helpful.
There are some biorisk and biosecurity podcasts or podcast episodes collected in the “Talks, Podcasts, and Videos” section of A Biosecurity and Biorisk Reading+ List
There is a German EA Podcast that Lia Rodehorst and I created, called “Gutes Einfach Tun”.
Here is the link.
Also, Sarah Emminghaus recently launched a German EA Podcast called “WirklichGut” (link here).
Recent addition: Founders Pledge have started a podcast called How I Give.
There’s also the podcast NonProphets:
I came across it because one of the three hosts is Robert de Neufville, who works with the Global Catastrophic Risk Institute.
I’ve only listened to one episode so far, but several seem fairly EA-related (e.g., one with Shahar Avin from CSER talking about AI), as one might expect given de Neufville’s involvement.
Ace! This is the first time I’ve heard of that podcast. Thanks for sharing.
A new podcast narration of Nate Soares’ Replacing Guilt series: https://anchor.fm/guilt
Sentience Institute released a new podcast on effective animal advocacy just today!
I also recommend Joe Carlsmith Audio.
Joe reads the essays himself.
80k After Hours
ChinaTalk
I’d recommend this for people interested in AI governance or otherwise interested in things like Chinese policymaking.
Metaculus Journal
National Security Commission on AI
A podcast related to the Final Report of the National Security Commission on Artificial Intelligence (NSCAI, 2021).
I’d recommend people focused on AI governance listen to at least some episodes. But unfortunately I often felt that the interviews (a) didn’t grab my attention and (b) weren’t really saying much of substance in a crisp and clear way. It sometimes felt as if the guests were talking in platitudes or talking vaguely around some topics, though that may just have been because I already knew a decent amount, or because I was spacing out.
The Asianometry Podcast
I’ve found this podcast really useful for getting up to speed on some topics relevant to compute governance (which I’m interested in due to my and my colleague’s AI governance work). So, to be clear, this is an “EA-relevant” podcast that I’d recommend for people with an interest in that area, but it’s not an “EA podcast” (I don’t think the host is involved in the EA community and I’m not sure if he’s even aware of it).
There’s also the podcast Utilitarian. Only two episodes so far, but one is with Anders Sandberg, and the podcast’s creator posted about that episode on the Forum, so this can probably be considered an EA-related podcast.
(I’m currently halfway through the Anders episode, and enjoying it so far.)
Very Bad Wizards: The One with Peter Singer (released in April, 2020)
I like that podcast a lot! I suggest skipping directly to 31:20, the second part, where Singer comes in, unless you are interested in half an hour of discussion about typography :)
These are some links to podcast episodes aimed at children that touch on EA topics.
https://www.buzzsprout.com/1018843/episodes/9252374 - Meet Vaidehi Agarwalla—An Effective Altruism Community Builder
https://www.buzzsprout.com/1018843/episodes/8940898 - Meet Wanyi Zeng—The Executive Director of Effective Altruism Singapore
https://www.buzzsprout.com/1018843/episodes/3754505 - Meet Elissa Lane—A Farm Animal Welfare Expert
https://www.buzzsprout.com/1018843/episodes/4040834 - Meet Abhay Rangan—A man dedicated to making plant-based milk affordable and accessible
https://www.buzzsprout.com/1018843/episodes/3950246 - Meet Varun Deshpande—A man on a mission for smart food systems
https://www.buzzsprout.com/1018843/episodes/3858503 - Meet Manjunath—A free-range egg farmer
Here are the general links to the podcast:
https://podcasts.apple.com/us/podcast/curious-vedanth/id1508532011
https://open.spotify.com/show/7ekQ9OoUrEVuavsudEO3Md
Updating with another recent podcast: GiveDirectly’s Michael Faye and Caroline Teti on Important, Not Important. A really interesting and easy-to-listen-to interview on the value of unconditional cash transfers and the philosophy behind them.
Invincible Wellbeing
Sentientism
The Good Timeline
Growth Podcast
Thanks for this. I’ve integrated this list as well as @pablo’s and a couple I added (‘Not Overthinking’ and ‘Great.com Talks With’) into an Airtable
View only
or you can collaborate on this base HERE
Why on Earth are you promoting Bjorn Lomborg, who has a history of misrepresenting climate science, here?
I think I’m done with this movement...
One person on a forum recommending a podcast doesn’t mean that podcast is endorsed by the movement as a whole; that would be true even if a large EA-aligned organization recommended something, let alone an individual compiling a list of resources they like.
To the extent that Lomborg’s name comes up on the Forum elsewhere, he seems to be regarded as a controversial figure; see this discussion, for example.
There are several articles on 80,000 Hours’ and GWWC’s websites in which they promote him, though. I find it sickening and intellectually dishonest.