‘There’s a common sense story of: more neurons → more compute power → more consciousness.’
I think it is very unclear what “more consciousness” even means. “Consciousness” isn’t “stuff” like water that you can have a greater weight or volume of.
Hi David,
Relatedly, readers may want to check Why Neuron Counts Shouldn’t Be Used as Proxies for Moral Weight. Here are the key takeaways:
Several influential EAs have suggested using neuron counts as rough proxies for animals’ relative moral weights. We challenge this suggestion.
We take the following ideas to be the strongest reasons in favor of a neuron count proxy:
neuron counts are correlated with intelligence and intelligence is correlated with moral weight,
additional neurons result in “more consciousness” or “more valenced consciousness,” and
increasing numbers of neurons are required to reach thresholds of minimal information capacity required for morally relevant cognitive abilities.
However:
in regards to intelligence, we can question both the extent to which more neurons are correlated with intelligence and whether more intelligence in fact predicts greater moral weight;
many ways of arguing that more neurons result in more valenced consciousness seem incompatible with our current understanding of how the brain is likely to work; and
there is no straightforward empirical evidence or compelling conceptual argument indicating that relative differences in neuron counts within or between species reliably predict welfare-relevant functional capacities.
Overall, we suggest that neuron counts should not be used as a sole proxy for moral weight, but cannot be dismissed entirely. Rather, neuron counts should be combined with other metrics in an overall weighted score that includes information about whether different species have welfare-relevant capacities.
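To make the “overall weighted score” idea concrete, here is a minimal sketch in Python. The capacity names, scores, and weights are entirely hypothetical, and the aggregation rule is only one of many possibilities; the point is structural: neuron count enters as one weighted input among several rather than as the sole proxy.

```python
# Illustrative sketch only: the capacity names, scores, and weights below are
# hypothetical, and this is just one of many possible aggregation rules.
from math import log10

HUMAN_NEURON_COUNT = 8.6e10  # rough estimate, used here only to normalise

def weighted_moral_proxy(neuron_count, capacity_scores, weights):
    """Combine a log-normalised neuron count with 0-1 scores for other
    welfare-relevant capacities into a single weighted score in [0, 1]."""
    # Log-normalise the neuron count so that raw counts cannot dominate:
    # a 1000x difference in neurons becomes a modest difference in this term.
    neuron_term = min(1.0, log10(neuron_count) / log10(HUMAN_NEURON_COUNT))
    features = {"neurons": neuron_term, **capacity_scores}
    return sum(weights[k] * features[k] for k in features) / sum(weights.values())

# Hypothetical example: a chicken-like profile (~2.2e8 neurons).
capacities = {"nociception": 1.0, "flexible_learning": 0.6, "self_recognition": 0.1}
weights = {"neurons": 0.25, "nociception": 0.35,
           "flexible_learning": 0.25, "self_recognition": 0.15}
print(round(weighted_moral_proxy(2.2e8, capacities, weights), 2))  # ~0.71
```

The log-normalisation is just one assumed way of keeping raw neuron counts from dominating the score; which capacities to include and how to weight them is exactly the open question the takeaways point to.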
I think it’s very unclear for sure. Why could consciousness not be like water that you could have more or less volume of? When I was a child I was perhaps conscious but less so than now?
Could a different species with a different brain structure have a different “nature” of consciousness while not necessarily being more or less conscious?
I agree it’s very unclear, but there could be directionality unless I’m missing some of the point of the concept...
I’m not saying it’s impossible to make sense of the idea of a metric of “how conscious” something is, just that it’s unclear enough what this means that any claim employing the notion without explanation is not “commonsense”.
100% agree nice one
Also, part (although not all) of the attraction of “more neurons = more consciousness” is, I think, a picture that comes from “more input = more of a physical stuff”, which is wrong in this case. I actually do (tentatively!) think that consciousness is sort of a cluster-y concept, where the more of a range of properties a mind has, the more true* it is to say it is conscious, but none of those properties definitively is “really” what being conscious requires (e.g. sensory input into rational belief, ability to recognize your own sensory states, some sort of raw complexity requirement to rule out very simple systems with the previous two features, etc.). And I think larger neuron counts will roughly correlate with having more of these sorts of properties. But I doubt this will lead to a view where something with a trillion neurons is a thousand times more conscious than something with a billion.
*Degrees of truth are also highly philosophically controversial though.
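As a purely arithmetical illustration of that last doubt (not something the commenter proposes), a linear mapping from neuron count to “amount of consciousness” would make a trillion-neuron system a thousand times more conscious than a billion-neuron one, whereas a sublinear mapping such as a log scale gives a much smaller ratio:

```python
# Purely illustrative arithmetic: linear vs. logarithmic scaling of a
# hypothetical "degree of consciousness" with neuron count.
from math import log10

billion, trillion = 1e9, 1e12

print(trillion / billion)                # linear scaling ratio: 1000.0
print(log10(trillion) / log10(billion))  # log scaling ratio: ~1.33
```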