My read on this so far is that low estimates for P(doom|AGI) are either born of ignorance of the true difficulties of AI alignment, stem from a lack of security mindset, or reflect a social phenomenon where people want to sound respectable and non-alarmist — in any case, not based on any sound technical argument.