I don’t think I’ve met people working on AGI who have P(doom) >50%. I do fairly often talk to people at e.g. OpenAI or DeepMind who believe it’s 0.1%-10%, however. And again, I don’t find the difference between probabilistically killing people at 5% vs 50% that morally significant.
I don’t know how useful it is to conceptualize AI engineers who actively believe >50% P(doom) as evil or “low-lifes” while giving a pass to people who have lower probabilities of doom. My guess is that it isn’t, and it would be better if we had an honest perspective overall. Relatedly, it’s better for people to be able to honestly admit “many people will see my work as evil, but I’m doing it for xyz reasons anyway” than to delude themselves otherwise, come up with increasingly implausible analogies, or refuse to engage at all.
I think it is much more likely that these MIRI folks have worked themselves into a corner of an echo chamber than it is that our field has attracted so many low-lifes who would sooner kill every last human than walk away from a job.
I agree this is a confusing situation. My guess is most people compartmentalize and/or don’t think of what they’re doing as that critical to advancing the doomsday machine, and/or they think other people will get there first and/or they think AGI is so far away that current efforts don’t matter, etc.
I would bet that most people who work at petroleum companies[1] (and for that matter, consumers) don’t think regularly about their contribution to climate change, marketers for tobacco companies don’t think about their impact on lung cancer, Google engineers on Project Maven don’t think too hard about how their work accelerates drone warfare, etc. I quite like the book Thank You for Smoking for some of this mentality.
Of course, probabilistically “killing all of humanity” is axiologically worse in scope than causing lung cancer, civilian casualties from drone strikes, or (arguably) marginal effects on climate change. But scope neglect is a well-known problem in human psychology, and we shouldn’t be too surprised that people’s psychology is not especially sensitive to magnitude.
For the record, I’m not pure here and I in fact do fly.