Here’s one question: which risks are you most concerned about?
I shy away from ranking risks, for several reasons:
The risks are often interrelated in important ways. For example, we analyzed a scenario in which geoengineering catastrophe was caused by some other catastrophe: http://sethbaum.com/ac/2013_DoubleCatastrophe.html. This weekend Max Tegmark was discussing how AI can affect nuclear war risk if AI is used for nuclear weapons command & control. So they’re not really distinct risks.
Ultimately what’s important to rank is not the risks themselves, but the actions we can take to reduce them. We may sometimes have better opportunities to reduce smaller risks. For example, maybe some astronomers should work on asteroid risks even though asteroid impacts are a relatively low-probability risk.
Also, the answer to this question varies by time period. For, say, the next 12 months, nuclear war and pandemics are probably the biggest risks. For the next 50-100 years, we need to worry about these plus a mix of environmental and technological risks.
And who do you think has the power to reduce those risks?
There’s the classic Margaret Mead quote, “Never doubt that a small group of thoughtful, committed citizens can change the world; indeed, it’s the only thing that ever has.” There’s a lot of truth to this, and I think the EA community is well on its way to being another case in point. That is, as long as you don’t slack off! :)
That said, I keep an eye on a mix of politicians, other government officials, researchers, activists, celebrities, journalists, philanthropists, entrepreneurs, and probably a few others. They all play significant roles and it’s good to be able to work with all of them.