I don’t have time to write up an argument for the claim now, but as a philosophy of mind PhD, I am extremely dubious that fewer neurons on its own means “less sentience” or less intense pains in any sense. Of course, lots of people think otherwise, and you should definitely give their views some weight in expectation, but I thought it was worth flagging that at least one person with some relevant expertise is highly skeptical of the assumption.
Agreed.
Also, those views are surprisingly common even though there doesn’t seem to be much justification for being certain of them (or even, say, more than ~90% confident), which a lot of people seem to be.
Humans making the claim that there’s “less sentience” or less intense pain in insects only have their own mental experiences, combined with knowledge of their own nervous systems, as a reference.
Some relevant posts:
https://reducing-suffering.org/is-brain-size-morally-relevant/
https://forum.effectivealtruism.org/posts/H7KMqMtqNifGYMDft/differences-in-the-intensity-of-valenced-experience-across
RP has some posts coming out on this soon that I imagine will cover some of your arguments: https://forum.effectivealtruism.org/s/y5n47MfgrKvTLE3pw
I also lend significant (25%) credence to a similar view, which I’d summarize as “animals with fewer neurons aren’t less sentient, their sensations are just less complex”.
I used to think that babies had fewer neurons even though people would not consider babies lower in moral patienthood, but I was mistaken: apparently neurogenesis ends early (and adult neurogenesis is still in question):
neurogenesis in humans generally begins around gestational week (GW) 10 and ends around GW 25, with birth at about GW 38-40.
However, perhaps we don’t want to go by neuron count alone: the number of synaptic connections seems perhaps just as important.
Speaking of synaptic connections, there’s another problem: human adults have fewer than the average human infant. Peter Huttenlocher “showed that synaptic density in the human cerebral cortex increases rapidly after birth, peaking at 1 to 2 years of age, at about 50% above adult levels. It drops sharply during adolescence then stabilizes in adulthood, with a slight possible decline late in life.”
But that apparently didn’t make most humans think that human babies are more sentient or have more complex experiences. Actually, we pretty much thought the reverse: until the mid-1980s, most doctors did not use anesthesia when operating on newborn human infants.
Yeah, I agree that it’s very uncertain. However, I was unsure how to quantitatively measure sentience with anything other than neuron count, although my range should probably be wider. Do you have any other ideas on how to quantitatively approximate sentience? What do you think the range should be?
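For concreteness, here’s a rough sketch (Python, with illustrative neuron counts and an exponent range I picked arbitrarily) of the kind of neuron-count-based estimate I had in mind:

```python
# A toy sketch, not a real methodology: treats "relative sentience" as
# (neuron ratio) ** exponent and averages over a few candidate exponents.
# All numbers here are illustrative assumptions of mine.

HUMAN_NEURONS = 8.6e10   # rough figure for an adult human brain
INSECT_NEURONS = 1e6     # rough order of magnitude for many insects

def relative_weight(neurons: float, exponent: float) -> float:
    """Moral weight relative to a human, scaling with neuron count.

    exponent = 1.0: weight proportional to neuron count.
    exponent = 0.0: neuron count morally irrelevant (weight = 1).
    """
    return (neurons / HUMAN_NEURONS) ** exponent

# Rather than committing to one exponent, spread credence over a range.
exponents = [0.0, 0.5, 1.0]  # equal credence in each, purely for illustration
expected = sum(relative_weight(INSECT_NEURONS, e) for e in exponents) / len(exponents)
print(f"expected relative weight of one insect: {expected:.4f}")
```

The exponent does all the work here, which is exactly the problem: the answer swings across several orders of magnitude depending on a parameter we have no principled way to pin down.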
So if by measuring sentience, you mean to ask ‘is X sentient?’ and mean something like ‘does X experience at least something, anything at all?’, then one view is that it’s a binary property. All or nothing, 0 or 1: the lights are either on or off, so to speak. I think this is why David scare-quoted “less sentience”.
In this case, we might be able to speak of probabilities of X being sentient or not sentient. I find Brian Tomasik’s idea of a ‘sentience classifier’ (replace the moral value output with P(X is sentient)) to be a useful way of thinking about this.
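To make that concrete, here’s a toy sketch of what such a classifier might look like; the feature names and weights are placeholders I made up, not anything Tomasik specifies:

```python
# A toy version of the 'sentience classifier' idea: map observable features
# to a subjective P(X is sentient). Feature names and weights are placeholders
# I invented; nothing here comes from Tomasik directly.

def p_sentient(features: dict) -> float:
    """Return a subjective probability that the system described is sentient.

    Not a trained model: the weights simply make one person's intuitions
    explicit, so they can be inspected and argued about.
    """
    weights = {
        "nociceptors": 0.3,        # has pain receptors
        "centralized_brain": 0.3,  # integrates information in one place
        "flexible_behavior": 0.2,  # learns and trades off competing goals
        "wound_tending": 0.2,      # protects or tends injured body parts
    }
    score = sum(w for name, w in weights.items() if features.get(name, False))
    return min(max(score, 0.0), 1.0)

# e.g. an insect-like profile:
print(p_sentient({"nociceptors": True, "flexible_behavior": True}))  # 0.5
```

The particular numbers aren’t the point; writing them down at all forces your intuitions into an explicit, criticizable form.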
There is a view that there could be fractional experiences. I myself don’t understand how that would be possible but there is speculation about it.
However, if by sentience you instead mean intensity of experiences etc. (given that at least something is experienced) then the 2020 moral weight series is a good starting point. I personally disagree with some of their conclusions but it’s a really good framework for thinking about it.
My own view, which is similar if not identical to Tomasik’s, is that when thinking about any of these, it’s ultimately ‘up to you to decide’. I would add that human intuitions didn’t evolve to track the ground truth about what is sentient or the content of such experiences. Instead, they developed out of the need to analyze the behavior of other agents. Our intuitions could be exactly right, entirely wrong, or somewhere in between, but there’s no way to really know because we aren’t other minds.