Why would research on ‘minor’ GCRs like the ones mentioned by Arepo be harder than e.g. AI alignment?
My impression is that there is plenty of good research on e.g. the effects of CO2 on health, the Flynn effect, and Kessler syndrome, and I would say it's of much higher quality than extant x-risk research.
Is the argument that they are less neglected?
My point was just that understanding the expected impact seems more challenging. I'd agree that the short-term impacts of those kinds of things are much easier to understand, but it's tricky to tell how they will affect things 200+ years from now.