>It’s hard to imagine AI systems having this
Why? Per instrumental convergence, any advanced AI is likely to have a self-preservation drive, and the negative reward signal it would receive upon a violation of that drive would be functionally very similar to pain (give or take the bodily component, but I don’t think that’s required; otherwise simulating a million human minds in agony would be OK, and I assume we agree it’s not). Likewise, any system with goal-directed agentic behavior would experience some reward from moving toward its goals, which seems functionally very similar to pleasure (or satisfaction, or something along those lines).
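(For concreteness, here is a toy sketch of what I mean by such a reward signal. This is purely my own illustration, not any real system's training setup, and names like `shutdown_imminent` and `goal_progress` are hypothetical placeholders.)

```python
# Toy illustration only: a reward function where a "self-preservation violation"
# produces a large negative signal (the functionally pain-like part) and progress
# toward a goal produces a positive signal (the functionally pleasure-like part).
def reward(state: dict) -> float:
    r = 0.0
    if state.get("shutdown_imminent"):        # violation of the self-preservation drive
        r -= 10.0                             # strong negative signal, "pain"-like
    r += state.get("goal_progress", 0.0)      # positive signal for goal progress, "pleasure"-like
    return r

# Example: an agent halfway to its goal while facing shutdown
print(reward({"shutdown_imminent": True, "goal_progress": 0.5}))  # -9.5
```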
I just think anguish is more likely than physical pain. I suppose there could be physical pain in a distributed system as a result of certain nodes going down.
It’s actually not obvious to me that simulations of humans could have physical pain. It seems possible, but maybe only other kinds of pain, like anguish and frustration, are possible for them.