Like I said on the other thread, I don’t think other researchers (at least not the ones cited by RP that I could access) are thinking much about quantifying pain, let alone comparing its magnitude to human pain. Instead, they are building models of potential neural pathways that could help differentiate which animals might experience pain and which might not. They are mostly (to put it simply) asking a “Yes/No” question, in which case I agree neuron count is probably irrelevant to whether animals feel pain at all.
But IMO this means that the fact they haven’t talked about neuron count isn’t much of a data point against neuron counts as a potential element of a comparative measure between humans and animals.
If there were a decent amount of research outside the EA sphere comparing moral weights that dismissed neuron count, I would feel differently.
I’m not very convinced. At the very least, this absence of discussion should be a significant update against using neuron count as a proxy for moral value right now.
Or at least until significant evidence has been provided that it can be a useful measure (for which I’d expect, at a minimum, acknowledgment by a number of top experts). Otherwise it’s akin to guessing.
Even if scientists are usually not very concerned with comparing the moral value of different beings, I’d expect that they’d still talk significantly about the number of neurons for other reasons. For instance, I’d expect they would have formulated theories of consciousness that are based on the number of neurons, in which experience ‘expands’ when there are more of them.
(I am not formulating this precisely, but I hope you get the idea.)
Should we evaluate the potential of neuron count as a proxy? Yes.
Should we use it to make significant funding allocation decisions with literal life-or-death consequences? No. At least not from what I’ve seen.
What you really want to look at (which I haven’t done properly) is the literature on what determines pain intensity and, relatedly, what makes pain feel bad rather than good, and what makes pain morally bad. That’ll tell you something about how friendly current theory is to “more neurons = more intense experiences”, even if the papers in that literature don’t specifically discuss whether that is true.
That is probably a good idea, although the burden of proof isn’t really on me here; it’s on the proponents of using neuron count as a proxy for moral weight. But it would be interesting if they did that, indeed.