I’m not very convinced. At the very least, this absence of discussion should be a significant update against using neuron count as a proxy for moral value right now, or at least until significant evidence has been provided that it can be a useful measure (for which I’d expect, at a minimum, acknowledgment by a number of top experts). Otherwise it’s akin to guessing.
Even if scientists are usually not very concerned with comparing the moral value of different beings, I’d expect that they’d still talk significantly about the number of neurons for other reasons. For instance, I’d expect them to have formulated theories of consciousness that are based on neuron count, in which experience ‘expands’ as the number of neurons grows. (I’m not formulating this precisely, but I hope you get the idea.)
Should we evaluate the potential of neuron count as a proxy? Yes.
Should we use it to make significant funding-allocation decisions with literal life-or-death consequences? No. At least not from what I’ve seen.
What you really want to look at (I haven’t done so properly) is the literature on what determines pain intensity and, relatedly, what makes pain feel bad rather than good, and what makes pain morally bad. That will tell you something about how friendly current theory is to “more neurons = more intense experiences”, even if the papers in that literature don’t specifically discuss whether it is true.
That is probably a good idea, although the burden of proof isn’t really on me here; it’s on the proponents of using neuron count as a proxy for moral weight. But it would indeed be interesting if they did that.