(Commenting before watching the video) I think you’re actually understating how much Niel knows:
From https://www.fhi.ox.ac.uk/team/niel-bowerman/: During the 2008 presidential election, Niel was a member of President Obama’s Energy and Environment Policy Team. Niel sat on the Executive Committee of the G8 Research Group: UK, was the Executive Director of Climatico, and was Climate Science Advisor to the Office of the President of the Maldives.
I also recall him doing some climate work for the UN, but it wasn’t listed there. In light of that, I think I’m pretty comfortable just outside-viewing and deferring to him on model uncertainty and extreme low-probability outcomes.
Or perhaps the question is “Is climate change important in the long term?”, in which case, if we’re thinking across multiple centuries, even with lots of technological development, if we’re looking at >6°C in 2300 (to pick an example), then I think the answer is yes.
I don’t know; I think even if temperatures rise by 10°C, humanity will still survive, but yes, in that case climate change will be very, very important.
Re timing: I picked 2100 because that’s where most forecasts I see are, but I think it’s defensible because I have an inside view that things will most likely get crazy for other reasons by then. I think this is outside-view defensible too: world GDP has doubled every ~20 years since 1950, so very naively we might expect a world that’s 16x(!) richer by 2100. Even if we don’t speculate on specific technologies, it seems hard to imagine a 16x richer world that isn’t meaningfully different from our own in hard-to-predict and hard-to-plan-for ways.
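For what it’s worth, the 16x figure is just compound doubling; here’s the back-of-the-envelope arithmetic, assuming a ~20-year doubling time and counting forward from roughly 2020 (both numbers are my assumptions, not precise forecasts):

```python
def gdp_multiplier(start_year: int, end_year: int, doubling_years: float) -> float:
    """Naive growth multiplier: 2 raised to the number of doublings."""
    doublings = (end_year - start_year) / doubling_years
    return 2 ** doublings

# 2020 -> 2100 at one doubling every ~20 years = 4 doublings = 16x
print(gdp_multiplier(2020, 2100, 20))  # prints 16.0
```

Obviously extrapolating a constant doubling time for 80 years is the naive part; the point is only that modest sustained growth compounds into a very unfamiliar world.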
All of this being said, I still think it’s fair to argue that AI, bio, and nuclear are more neglected and tractable relative to climate change.
I agree about neglectedness. I’m agnostic towards whether climate change mitigation is more tractable than biosecurity or AGI safety.