AMA: Tom Ough, Author of ‘The Anti-Catastrophe League’, Senior Editor at UnHerd

Tom will answer questions in the comments between 7 and 9 pm UK time on August 6th.

Tom Ough is a Senior Editor at UnHerd, co-host of the Anglofuturism podcast, and the author of a new book, ‘The Anti-Catastrophe League’. For the last few years, he has been writing in a range of publications about global catastrophic risks, and (sometimes relatedly) how Britain can prosper by adopting new technologies. Previously, he worked as a journalist at The Telegraph, where he wrote for nearly seven years.

You can see more of his articles on his website, and a few of his most EA-flavoured pieces below:

Recently, Tom published The Anti-Catastrophe League, a book profiling the individuals and organisations working to reduce global catastrophic risks. For readers of the EA Forum, many of the people profiled will be familiar, but the range of reporting in the book ensures that you’ll learn something new.

The Anti-Catastrophe League

Over to Tom, for a summary of his book:

INTRODUCTION: Defusing Doomsday

I introduce the concepts of GCRs and X-risk, relating an interview with Andy Weber, the American diplomat who tracked down the nuclear weapons material that post-Soviet Russia had lost control of.

CHAPTER 1: The Double Asteroid Redirection Test

A history of humanity’s relationship with asteroids, beginning with a period (early 19th-century France) when it was unfashionable to believe in their existence, and continuing to the NASA-led deflection of an asteroid in 2022.

CHAPTER 2: A Raging Star

A disputed GCR: solar storms. I interview John Kappenman, a doubted prophet of doom; Kappenman thinks a bad solar storm would be very difficult for civilisation to recover from. I touch on prepping and EMPs.

CHAPTER 3: Stoppering a Supervolcano

Continuing the theme of natural risks, I broach the concept of volcano geoengineering. I interview Anders Sandberg about tampering with supervolcanoes in order to stop them causing volcanic winter.

CHAPTER 4: ‘We Will Die, But We Will Sink Them All!’

Nuclear weapons. I explain how low-budget science snowballed into the most powerful weapons ever created. I note some of the parallels and differences between the development of nuclear weapons and the development of AI. I relate, via an interview, the story of Rose Gottemoeller, who was a schoolgirl during the Cuban Missile Crisis and went on to lead the US team that negotiated the last nuclear arms deal with Russia.

CHAPTER 5: Feeding the Benighted

It might still go pear-shaped. What if there’s a nuclear, volcanic, or asteroid winter? I interview David Denkenberger of ALLFED and discuss the psychological burden of his line of work as well as the logistics of feeding a darkened world.

CHAPTER 6: Tearing off the Hairshirt

Climate. I bemoan the neglect of nuclear power. I discuss various attempts to create clean firm power. I touch on Casey Handmer’s Terraform Industries, which synthesises fuel from air, and interview Andrew Song, a renegade solar geoengineer.

CHAPTER 7: Supercritical

A deep dive, if you’ll pardon the pun, into geothermal power, which has imported some techniques and machinery from oil and gas, and finally looks like it could make meaningful contributions to baseload power. I cover the deepest hole in the world (the Kola Superdeep Borehole); the US government’s geothermal field lab; and a site in Cornwall that I visited in 2022 or so. I touch on Quaise (transparent lasers that penetrate bedrock), interviewing two key figures (its founder, a converted oil and gas man; and its guiding light, as it were, a former fusion scientist) and noting the technical challenges they face.

CHAPTER 8: The Misfit Prodigy

A biography of the Future of Humanity Institute, which, from 2005, pondered more speculative risks to humanity. I interview Nick Bostrom and many others. This chapter is an extended version of the article I wrote for Asterisk in 2024.

CHAPTER 9: Disease X

I posit a scenario in which an engineered pathogen escapes a lab and causes a pandemic more devastating than Covid-19. Imagining how we might defend ourselves, I refer to far-UVC light and to self-sterilising surfaces, among other measures.

CHAPTER 10: The West Siberian Hat Factory – and Worse

I nod to the history of bioweapons programmes, and talk about diplomatic efforts to head off pandemics.

CHAPTER 11: The Race for Alignment

Here I trace the influence of FHI on early work on AI alignment. I document the intellectual evolution of the field, interviewing Lee Sharkey and Neel Nanda on mechanistic interpretability (‘mech interp’), and moving on to Buck Shlegeris’ break from mech interp and his work on AI control.

CHAPTER 12: Poking the Shoggoth

Relying on insider accounts from 10 Downing Street and the UK AI Security Institute, I write about the nascent field of AI governance.

CHAPTER 13: Whistleblowers

I discuss the act of whistleblowing and interview Daniel Kokotajlo, co-author of the AI 2027 scenario.

CHAPTER 14: The Methuselarity

Here I write about the one that’ll get us all in the end: ageing. I interview some of the people at the cutting edge of longevity research.

CONCLUSION: Existential Hope

Give the ending away? Not likely! :)

What might you ask Tom?

Here are a few ideas for questions to get you thinking. But as always, ask anything!

  • How (and why) do British papers mis-prioritise extreme risks?

  • What have you learned in your reporting on GCRs that EAs don’t adequately recognise?

  • Which past article of yours would you now retract?

  • Were there any unexpected differences between writing a piece of reporting and writing your book?