Wow, that is a strong claim!
Could these conscious AI also have affective experience?
Perhaps I oversold the provocative title. But I do think that affective experiences are much harder, so even if there is a conscious AI, it is unlikely to have the sorts of morally significant states we care about. While I think it is plausible that current theories of consciousness are relatively close to complete, I'm less convinced that current theories of valence are anywhere near complete accounts. There has been much less work in this direction.
Which makes me wonder how anyone expects to identify whether software entities have affective experience.
Is there any work in this direction that you like and can recommend?
There is a growing body of work in philosophy investigating the basic nature of pain that seems relevant to identifying important valenced experiences in software entities. What the Body Commands by Colin Klein is a representative and reasonably accessible book-length introduction that pitches one of the current major theories of pain. Applying it to conscious software entities wouldn't be too hard. Otherwise, my impression is that most of the work is too recent and too niche to have accessible surveys yet.
Overall, I should say that I'm not particularly sympathetic to the theories that people have come up with here, but you might disagree, and I don't think you have much reason to take my word for it. In any case, they are trying to answer the right questions.