This is an interesting post.

Regarding neuron weights, I came across an interesting discussion last week on a post discussing RP's Moral Weight Project. During this discussion, this comment by @David Mathers🔸 stood out (emphasis mine):
I am far from an unbiased party since I briefly worked on the moral weight project as a (paid) intern for a couple of months, but for what it's worth, as a philosophy of consciousness PhD, it's not just that I, personally, from an inside point of view, think weighting by neuron count is a bad idea, it's that I can't think of any philosopher or scientist who maintains that "more neurons make for more intense experiences", or any philosophical or scientific theory of consciousness that clearly supports this. The only place I've ever encountered the view is among EAs, usually without a formal background in philosophy of mind or cognitive science, defending focusing near-termist EA money on humans. (Neuron count might correlate with other stuff we care about apart from experience intensity of course, and I'm not a pure hedonist.)
For one thing, unless you are a mind-body dualist (and the majority of philosophers are not), it doesn't make sense to think of pain as some sort of stuff/substance, like water or air, that the brain can produce more or less of. And I think that sort of picture lies behind the intuitive appeal of "more neurons = more intense experiences".
The author, @NickLaing, confirmed this:

I had a bit of a scan and I couldn't find [references to neuron count for moral weight] outside the EA sphere.
I was surprised because, given the frequency at which neuron counts are discussed on the forum, it felt from the outside like a position that could have some academic credibility.
Personally, this makes me think that neuron count should not be considered among the most credible ways of comparing the moral weight of different species (unless further evidence arises, of course).
Of course, I understand discussing this while emphasising the speculative aspect ("what if neurons were actually important?"). But I think that saying "here's the conclusion with RP's Moral Weight Project, here's the conclusion with neuron count" gives way too much credibility to neuron counts as a measure, given the lack of evidence behind this position.
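To make that contrast concrete, here is a rough back-of-the-envelope sketch (mine, not RP's) of how far the two proxies diverge for chickens. The neuron counts are approximate published figures (roughly 86 billion for humans, roughly 220 million for a chicken), and the ~0.33 welfare-range figure is my recollection of RP's reported median estimate for chickens; all numbers are illustrative only.

```python
# Illustrative only: a chicken's moral weight under two proxies,
# neuron-count weighting vs RP's welfare-range estimate,
# with humans normalised to 1.0. All figures are approximate.

HUMAN_NEURONS = 86_000_000_000    # ~86 billion neurons (approximate)
CHICKEN_NEURONS = 220_000_000     # ~220 million neurons (approximate)
RP_CHICKEN_WELFARE_RANGE = 0.33   # RP's reported median estimate (approximate)

neuron_count_weight = CHICKEN_NEURONS / HUMAN_NEURONS  # ~0.0026

print(f"neuron-count proxy:  {neuron_count_weight:.4f}")
print(f"RP welfare range:    {RP_CHICKEN_WELFARE_RANGE:.2f}")
print(f"disagreement factor: {RP_CHICKEN_WELFARE_RANGE / neuron_count_weight:.0f}x")
```

On these rough numbers the two proxies disagree by about two orders of magnitude, so presenting them side by side as equally live options effectively decides where the money goes.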
Like I said on the other thread, I don't think other researchers (at least not the ones cited by RP that I could access) are thinking much about quantifying pain, let alone comparing the quantity to human pain. Instead they are making models of potential neural pathways that could help differentiate which animals might experience pain and which might not. They are mostly (to put it simply) asking a "Yes/No" question, in which case I agree neuron count is probably irrelevant to the question of whether animals feel pain or not.
But IMO this means the fact that they haven't talked about neuron count isn't much of a data point against neuron counts as a potential element of a comparative measure between humans and animals.
If there were a decent amount of research out there comparing moral weights outside the EA sphere that dismissed neuron count, I would feel differently.
I'm not very convinced. At the very least, this absence of discussion should be a significant update against using neuron count as a proxy for moral value right now, or at least until significant evidence has been provided that it can be a useful measure (for which I'd expect, at a minimum, acknowledgment by a number of top experts). Otherwise it's akin to guessing.
Even if scientists are usually not very concerned with comparing the moral value of different beings, I'd expect that they'd still talk significantly about the number of neurons for other reasons. For instance, I'd expect that they would have formulated theories of consciousness that are based on the number of neurons, in which experience "expands" when there are more of them. (I am not formulating this precisely, but I hope you get the idea.)
Should we evaluate the potential of neuron count as a proxy? Yes.

Should we use it to make significant funding allocation decisions with literal life-or-death consequences? No. At least not from what I've seen.
What you really want to look at (I haven't, properly) is the literature on what determines pain intensity and, relatedly, what makes pain feel bad rather than good, and what makes pain morally bad. That'll tell you something about how friendly current theory is to "more neurons = more intense experiences", even if the papers in the literature don't specifically discuss whether that is true.
That is probably a good idea, although the burden of proof isn't really on me here; it's on the proponents of using neuron count as a proxy for moral weight. But it would indeed be interesting if they did that.