Can anyone recommend to me some work on existential threats as a whole? I don’t just mean AI or technology-related threats, but nuclear war, climate change, etc.
Btw Nick Bostrom’s Superintelligence is already at the top of my reading list, and I know Less Wrong is currently engaged in a reading group on that book.
For overviews, I recommend:
- Preventing Human Extinction by Beckstead, Wage, and Singer
- Existential Risk Prevention as Global Priority by Bostrom
- Reducing the Risk of Human Extinction by Matheny
- Global Catastrophic Risks Survey by Sandberg and Bostrom
For book-length detail, Global Catastrophic Risks by Bostrom and Ćirković is good. Others that I haven’t read are Our Final Hour by Martin Rees and Catastrophe: Risk and Response by Richard Posner.
I haven’t read it myself, but I believe that the book ‘Global Catastrophic Risks’ (edited by Nick Bostrom and Milan Ćirković, with a foreword by Martin Rees) covers a broad range. Here are links to it (5% of Amazon purchases through them go to SCI): US; UK.
You can also read The Open Philanthropy Project’s (previously GiveWell Labs) notes on the x-risks they’ve investigated.