Thanks for your comments—really great to learn more about this topic.
1. Agree, and optimisation in some soft-EA fields could be beneficial.
2. Agree: all other things being equal, it would be better to work directly on an x-risk. But it would benefit EA, for the reasons given later, to acknowledge climate change as a potential stressor on x-risk. My assumption is that climate change may be tractable for a bigger pool of people. If someone concerned about x-risk happens to be an expert in renewable energy, working on the breakthrough of some new technology, they could understand that work as reducing the overall x-risk portfolio. And if direct x-risk work is heavily oversubscribed, or depends on only a small number of actors (e.g. nuclear), then some people may have more leverage on climate change.
3. Agree that we should focus on things that affect the overall trajectory of civilisation. But is climate change really an intractable problem? If so, why do so many smart people at universities, the UN, and the IPCC have reducing emissions as a goal? It may be unrealistic to assume we'll reach net zero, but isn't it still a worthwhile goal to slow the rate of warming and give us more time to adapt?
I don’t profess to have the answer, but I’d be interested in the debate. I worry that this discussion doesn’t have enough input from real experts in this space.
If climate change is intractable, then what’s the next step? Should we be looking at geoengineering, adaptation, resilience? Assuming that climate change is intractable, then here are some other rough ideas of things that could help global welfare:
Early warning systems for floods in areas with anticipated rising sea levels
Research into how societies should manage heat stress
Research and development of more resilient infrastructure, e.g. energy, food, and water. Even if accurately pricing water and similar resources is only a theoretical question, I'm not confident that, globally, we're doing it well in practice at the moment
4, 5: Interesting to read. I don't profess to be an expert, so would appreciate learning from other perspectives.
6, 7, 8: I wonder whether some modelling is overconfident about how resilient societies will be to climate change. We're densely networked, and there could be many unanticipated secondary effects, such as mosquito-borne diseases reaching another billion people. The latest UK adaptation report acknowledges biodiversity as one of several areas urgently needing further research.
I'll focus on point 2 because I think it is the most important. I don't see the argument that, for the vast majority of people, working on climate change promises more leverage on the problem of nuclear war than working directly on nuclear war does. Nuclear war is easier to make progress on, more neglected, and more important than climate change.