Not entirely clear why the sadistic robots would do such a thing.
One thing I liked about the novel ‘Surface Detail’ was that the sadists imposing the suffering had at least some kind of semi-plausible religious rationale for what they were doing—which makes the whole scenario more psychologically plausible and therefore all the more terrifying.
Yeah, I agree it’s not clear why they’d do it. I give the comic writer some slack though, since it’s hard to fit that much into a comic.
A couple of reasons off the top of my head that this could happen:
Sign flip. Accidentally flip the sign and instead of trying to maximize human flourishing, it’s trying to minimize it.
Punishment. Imagine a dictator created TAI and was using it to punish people that fit a certain demographic (e.g. Uyghurs). Imagine that the human there is a Uyghur, or that the dictator failed to sufficiently specify the demographic and it started targeting everybody, or large swathes of the world.
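The sign-flip failure mode is concrete enough to sketch in a few lines. This is purely a toy illustration of the mechanism, not a claim about any real system; `flourishing` and `hill_climb` are made-up names for the example:

```python
# Toy illustration of a sign-flip bug: one negation turns an optimizer
# that maximizes an objective into one that minimizes it.

def flourishing(x):
    """Stand-in utility function: peaks at x = 3."""
    return -(x - 3) ** 2

def hill_climb(utility, x=0.0, step=0.01, iters=10_000):
    """Crude 1-D hill climber: move in whichever direction looks better."""
    for _ in range(iters):
        if utility(x + step) > utility(x):
            x += step
        else:
            x -= step
    return x

# Intended objective: converges near the peak at x = 3.
best = hill_climb(flourishing)

# Same code with one flipped sign: now it actively drives
# "flourishing" as low as it can within the iteration budget.
worst = hill_climb(lambda x: -flourishing(x))
```

The point of the sketch is that nothing else in the optimizer changes; the single negation is the entire difference between "maximize" and "minimize", which is what makes this class of bug so easy to introduce and so bad if it ships.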
Honestly though, I think the most probable s-risks are the incidental ones (covered in Tobias’s book and also this blog post here). Basically, something where suffering is a side-product, like factory farming or slavery. I also put highest odds it would be for digital minds, since I think the future will be predominantly digital minds.
But it’d be very hard to make a comic about digital minds that would be emotionally compelling, which is why I like the comic (although “like” is a bit of a strong word — more “found incredibly psychologically scarring, but in a way that helps me remember what I’m fighting for”).
Oh no. Very horrifying!