I can plausibly see such sensors for physical pain but not for emotional pain. Emotional pain is the far more potent teacher of what is valuable and what is not, what is important and what is not. Intelligence needs direction of this sort for learning.
So, can you build embodied AGI with built-in emotional responses: responses that persist the way emotions do, and so can teach the way emotions do? Building empathy (for both happiness and suffering) and the pain of disapproval into an AGI would be crucial.
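One way to picture emotional pain acting as a teacher is as a shaped learning signal. Below is a minimal sketch of that idea, assuming a hypothetical `EmotionalState` whose responses decay slowly (so they last like emotions rather than firing like reflexes); none of these names refer to an existing system.

```python
# Hypothetical illustration: emotional responses as a persistent
# learning signal. All names here are assumptions, not a real API.

from dataclasses import dataclass

@dataclass
class EmotionalState:
    empathy_for_suffering: float = 0.0  # 0.0 (none) .. 1.0 (acute)
    empathy_for_happiness: float = 0.0
    disapproval_pain: float = 0.0       # sting of social disapproval
    decay: float = 0.95                 # emotions fade slowly, not instantly

    def step(self, observed_suffering, observed_happiness, disapproval):
        # New events raise the response; otherwise it decays gradually,
        # so the signal "lasts like an emotion" rather than a reflex.
        self.empathy_for_suffering = max(
            self.empathy_for_suffering * self.decay, observed_suffering)
        self.empathy_for_happiness = max(
            self.empathy_for_happiness * self.decay, observed_happiness)
        self.disapproval_pain = max(
            self.disapproval_pain * self.decay, disapproval)

    def learning_signal(self) -> float:
        # Empathic joy rewards; empathic pain and disapproval punish.
        return (self.empathy_for_happiness
                - self.empathy_for_suffering
                - self.disapproval_pain)
```

Because the signal decays over many steps instead of vanishing at once, it can direct learning toward what is valuable and away from what earns disapproval, in the way the paragraph above describes.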
There are two plausible ways to embody AGI:

1. As supervisor of a dumb robot body: the AGI remotely controls the robot, processing some or all of its sensor data (a minimal sketch of this arrangement follows the list).
2. As a hosted AGI: the AGI's hardware resides in the robot body itself.
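Here is a minimal sketch of the first (supervisor) arrangement, assuming a simple line-delimited JSON protocol over TCP; the host name, port, and `agi_policy` callable are all hypothetical placeholders, not a real robot interface.

```python
# Hypothetical sketch: remote AGI supervising a dumb robot body.
# The body streams sensor frames; the AGI sends motor commands back.

import json
import socket

def supervisor_loop(agi_policy, host="robot.local", port=9000):
    """Run the AGI's decision loop remotely over an assumed TCP link."""
    with socket.create_connection((host, port)) as link:
        reader = link.makefile("r")
        while True:
            line = reader.readline()
            if not line:                        # body disconnected
                break
            sensor_frame = json.loads(line)     # some or all sensor data
            action = agi_policy(sensor_frame)   # the remote AGI decides
            link.sendall((json.dumps(action) + "\n").encode())
```

In the second arrangement the same decision loop would run in-process on hardware inside the body, trading away network latency and dropout risk in exchange for carrying the compute onboard.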