Having looked at your sources, I am not sure they justify the conclusions.
In particular:
Your sources for point 1 seem to ignore the >10% chance that the world warms significantly more than expected (they generally look at mortality in the business-as-usual case).
Your sources for point 2 focus on whether climate change is truly existential, but they do seem to point to the possibility of it being a global catastrophe. (Point 2 appears to be somewhat crucial; the other points, especially 1, 4, 5, and 7, depend on it.)
It seems plausible from looking at your sources that there are tail risks of extreme warming that could lead to a huge global catastrophe (maybe not quite at your cut-off of a 10% chance of 10% mortality, but huge nonetheless; see the rough sketch after the quotes below).
E.g., Halstead:
“On current pledges and promises, we’ll probably end up at around 700ppm by 2100 and increasing well beyond that.”
“at 700ppm, … there is an 11% chance of an eventual >6 degrees of warming”
“at 1120ppm, there is between a 10% and 34% chance of >9 degrees of warming”
“Heat stress … seems like it would be a serious problem for warming >6 degrees for large portions of the planet … With 11–12 °C warming, such regions would spread to encompass the majority of the human population as currently distributed”
“6 degrees would drastically change the face of the globe, with multi-metre sea level rises, massive coastal flooding, and the uninhabitability of the tropics.”
“10 degrees … would be extremely bad”
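To make the tail risk concrete, here is a rough back-of-the-envelope sketch combining the quoted figures. The 50% prior on reaching ~700ppm is my own illustrative assumption (reading Halstead’s “probably” as at least a coin flip), not a number from the source:

```python
# Back-of-the-envelope tail-risk estimate from the quoted Halstead figures.
# The prior on reaching ~700ppm is an illustrative assumption, not Halstead's.
p_700ppm = 0.5              # assumed: "we'll probably end up at around 700ppm"
p_6deg_given_700ppm = 0.11  # quoted: "an 11% chance of an eventual >6 degrees"

# A rough lower bound on the unconditional chance of >6 degrees of warming,
# since it ignores the even worse scenarios where concentrations rise
# beyond 700ppm (where the quoted probabilities climb further).
p_6deg = p_700ppm * p_6deg_given_700ppm
print(f"P(>6 degrees of warming) >= {p_6deg:.1%}")  # 5.5%
```

On those assumptions the unconditional chance of >6 degrees is already in the several-percent range, which is why I would not dismiss the catastrophe scenario even if it falls short of the stated cut-off.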
Overall, I expect points 1 and 2 are quite possibly correct, but having looked through your sources and concluded that they do not justify the points very well, I would have low confidence in them.
Also, points 4 and 7 seem to depend on what kinds of skills and influence you have and are using. E.g., if you are long-term focused and have political influence, climate issues might be a better thing to focus on than AI safety, which is not really on the political agenda much.