So if by 'measuring sentience' you mean to ask 'is X sentient?', i.e. something like 'does X experience at least something, anything at all?', then one view is that sentience is a binary property: all or nothing, 0 or 1, the lights are either on or off, so to speak. I think this is why David scare-quoted "less sentience".
In that case, we might still be able to speak of the probability that X is sentient. I find Brian Tomasik's idea of a 'sentience classifier' (replace the moral value output with P(X is sentient)) to be a useful way of thinking about this.
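To make that idea concrete, here is a toy sketch of what such a classifier might look like. This is purely illustrative and not Tomasik's actual proposal in code: the feature names and weights are invented placeholders, and the point is only the structure, i.e. the output is a subjective probability of sentience rather than a moral value.

```python
def sentience_probability(features: dict[str, float]) -> float:
    """Return a subjective P(X is sentient) in [0, 1].

    `features` scores a system on 0-to-1 dimensions one might care about
    (behavioral flexibility, nociception, etc.). The weights below are
    arbitrary placeholders, not empirically grounded values.
    """
    weights = {
        "behavioral_flexibility": 0.4,
        "nociception": 0.3,
        "integrated_processing": 0.3,
    }
    score = sum(weights.get(k, 0.0) * v for k, v in features.items())
    return max(0.0, min(1.0, score))  # clamp to a valid probability

# Example: a hypothetical organism scored on each feature.
print(sentience_probability({
    "behavioral_flexibility": 0.8,
    "nociception": 0.9,
    "integrated_processing": 0.5,
}))  # -> 0.74
```

The key design choice, on this framing, is that the classifier never claims a ground truth; it just makes one's credences explicit and consistent across cases.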
There is also a view that there could be fractional experiences. I myself don't understand how that would be possible, but there is speculation about it.
However, if by sentience you instead mean the intensity of experiences, etc. (given that at least something is experienced), then the 2020 moral weight series is a good starting point. I personally disagree with some of its conclusions, but it's a really good framework for thinking about this.
My own view, which is similar if not identical to Tomasik's, is that when thinking about any of these questions, it's ultimately 'up to you to decide'. I would add that human intuitions didn't evolve to track the ground truth about what is sentient or the contents of such experiences; they developed because we needed to analyze the behavior of other agents. Our intuitions could be exactly right, entirely wrong, or somewhere in between, but there's no way to really know, because we aren't other minds.