I think it’s very unclear, for sure. Why couldn’t consciousness be like water, something you could have a greater or lesser volume of? When I was a child, perhaps I was conscious but less so than I am now?
Could a different species with a different brain structure have a different “nature” of consciousness without necessarily being more or less conscious?
I agree it’s very unclear, but there could still be directionality, unless I’m missing part of the point of the concept...
I’m not saying it’s impossible to make sense of the idea of a metric of “how conscious” something is, just that it’s unclear enough what this means that any claim employing the notion without explanation is not “commonsense”.
100% agree, nice one.
Also, part (although not all) of the attraction of “more neurons = more consciousness” comes, I think, from a picture on which consciousness is a kind of physical stuff, so that more of it requires more material, which is the wrong picture in this case. I actually do (tentatively!) think that consciousness is a cluster-y concept, where the more of a range of properties a mind has, the more true* it is to say it is conscious, but none of those properties is definitively what being conscious “really” requires (e.g. sensory input feeding into rational belief, the ability to recognize your own sensory states, some sort of raw complexity requirement to rule out very simple systems that have the previous two features, etc.). And I think larger neuron counts will roughly correlate with having more of these sorts of properties. But I doubt this will lead to a view where something with a trillion neurons is a thousand times more conscious than something with a billion.
*Degrees of truth are also highly philosophically controversial though.