I, for one, think it is good that climate change was not mentioned. Not necessarily because there are no analogies or lessons to be drawn, but because the comparison can more easily be misinterpreted. The kinds of actions and risks involved are much more similar to bio and nuclear, in that there are far fewer actors and, at least for now, the technology is much less integrated into day-to-day life. Moreover, in many scenarios the risk itself is more abrupt and binary in nature (though of course not completely so), rather than a very long and gradual process. I'd be worried that comparing AI safety to climate change would be easily misinterpreted or dismissed on irrelevant grounds.