Could you link the most relevant piece you are aware of? What do you mean by “independently”? Under hedonism, I think the probability of consciousness only matters to the extent that it informs the probability of valenced experiences.
The idea is more aspirational. I’m not really sure what to recommend in the field, but this is a pretty good overview: https://arxiv.org/pdf/2404.16696
Interesting! How?
Perhaps valence requires something like the assignment of weights to alternative possibilities (most likely together with a bunch of other constraints). If you can look inside the AI and confirm that it makes decisions in a different way, you can conclude that it doesn’t have valenced experiences: the absence of even one necessary condition is enough to disconfirm something. Of course, this sort of requirement is likely to be controversial, but it is less open to radically different views than consciousness itself.
Yeah, that’s right. Some kinds of mitigation will increase risks later (e.g., a pause), and the model doesn’t accommodate that nuance.