Podcast Interview with David Thorstad on Existential Risk, The Time of Perils, and Billionaire Philanthropy
Link post
I have released a new episode of my podcast, EA Critiques, where I interview David Thorstad. David is a researcher at the Global Priorities Institute and also writes about EA on his blog, Reflective Altruism.
In the interview we discuss three of his blog post series:
Existential risk pessimism and the time of perils: Based on his academic paper of the same name, David argues that there is a surprising tension between the idea that there is a high probability of extinction (existential risk pessimism) and the idea that the expected value of the future, conditional on no existential catastrophe this century, is astronomically large.
Exaggerating the risks: David argues that the probability of an existential catastrophe from any source is much lower than many EAs believe. At the time of recording, the series had only covered risks from climate change, but future posts will make the same case for nuclear war, pandemics, and AI.
Billionaire philanthropy: Finally, we talk about the potential issues with billionaires using philanthropy to have an outsized influence, and how both democratic societies and the EA movement should respond.
As always, I would love feedback on this episode or the podcast in general, as well as guest suggestions. You can write a comment here, send me a message, or use this anonymous feedback form.