The more numerous or larger the changes needed to get from one brain to another, the less tight the bounds on the comparisons could become, the further they may go both negative and positive overall,[2] and the less reasonable it seems to make such comparisons at all.
Thanks for the post, Michael.
I agree comparisons become increasingly uncertain as the difference between the states of the organisms increases. However, I do not think there is a point where comparisons go from possible, but extremely difficult, to not possible at all. I would say there is just a progressive widening of the distribution representing the hedonistic welfare per unit time of a given state of an organism as it moves away from typical human states. As an example, I could say my hedonistic welfare right now is 0.5 to 1.5 times that of a random human who is awake, whereas that of a random nematode might be 10^-17 to 1 times that of a random human who is awake. I estimate the ratio between the number of neurons of an individual nematode and an individual human is 2.79*10^-9, whose square is 7.78*10^-18, roughly 10^-17, which is where the lower bound above comes from.
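As a back-of-envelope check, here is a minimal Python sketch of where the 10^-17 lower bound comes from, using only the neuron-count ratio stated above (the ratio is taken as given rather than recomputed from neuron counts):

```python
# Back-of-envelope check of the figures above: the stated nematode-to-human
# neuron-count ratio and its square, which sets the lower bound of the
# nematode welfare interval relative to a random awake human.

neuron_ratio = 2.79e-9            # nematode neurons / human neurons, as stated above
lower_bound = neuron_ratio ** 2   # squared ratio used as the lower bound
upper_bound = 1.0                 # upper bound: parity with a random awake human

print(f"Squared neuron ratio: {lower_bound:.2e}")  # 7.78e-18, roughly 1e-17
print(f"Nematode welfare per unit time, relative to a random awake human: "
      f"{lower_bound:.2e} to {upper_bound:.2e}")
```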
On the nematode example, it could go further than that: we might assign an imprecise credence between X and 100% to a set of standards for sentience that nematodes don’t meet (see my other post on gradations of moral weight). So, the ratio could be anywhere between 0 and 1 (assuming we’re taking the absolute value, or only considering same-sign valence).
If the ratio is anywhere between 0 and 1, then whenever we’re looking at affecting nematode-seconds relative to their welfare ranges more than human-seconds relative to our welfare ranges, it would be indeterminate which is affected more. I think that would be the case every time in practice.
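Here is a hedged illustration of that indeterminacy with made-up numbers (the quantities below are hypothetical, not figures from this discussion): an intervention affects far more nematode-seconds, relative to the nematode welfare range, than human-seconds relative to the human welfare range, yet the sign of the comparison flips across the imprecise ratio interval:

```python
# Hypothetical numbers: welfare-range-weighted seconds affected by some intervention.
nematode_seconds = 1e6  # nematode-seconds affected, in multiples of the nematode welfare range
human_seconds = 1.0     # human-seconds affected, in multiples of the human welfare range

# Two admissible values of the imprecise nematode/human welfare-range ratio in (0, 1].
for ratio in (1e-12, 1.0):
    nematode_welfare = nematode_seconds * ratio  # convert to human welfare-range units
    print(f"ratio = {ratio:g}: nematodes affected more? {nematode_welfare > human_seconds}")

# One admissible ratio says the humans are affected more, another says the nematodes
# are, so the comparison is indeterminate.
```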
If we don’t need to deal with gradations/vagueness like this, then I would probably assign expected welfare ranges (conditional on sentience) between constant and roughly proportional to the number of neurons, and this could give many more practically useful comparisons. EDIT: although the possibility of conscious subsystems makes me more inclined towards approximately proportional, if we’re entertaining nematode sentience.
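For concreteness, a sketch of the two endpoints mentioned above, with illustrative neuron counts that are my assumptions rather than figures from the comment:

```python
# Illustrative neuron counts (assumptions, not from the comment).
HUMAN_NEURONS = 86e9        # roughly 86 billion neurons in a human
NEMATODE_NEURONS = 302      # C. elegans has 302 neurons
HUMAN_WELFARE_RANGE = 1.0   # normalise the human welfare range to 1

# Endpoint 1: welfare range (conditional on sentience) is constant across species.
constant_model = HUMAN_WELFARE_RANGE

# Endpoint 2: welfare range scales roughly in proportion to neuron count.
proportional_model = HUMAN_WELFARE_RANGE * NEMATODE_NEURONS / HUMAN_NEURONS

print(f"Nematode welfare range, constant model:     {constant_model:.1e}")
print(f"Nematode welfare range, proportional model: {proportional_model:.1e}")  # ~3.5e-09
```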
I would be curious to know your thoughts on this discussion between me and Anthony DiGiovanni about imprecise expected values.