An uninformed prior says all individuals have equal moral weight
That's one way of constructing an uninformed prior, but that seems quite a bit worse than starting from a place of equal moral weight among cells, or perhaps atoms or neurons, all of which would give less animal-friendly results, though still more animal-friendly results than mainstream human morality.
(And of course this is just a prior, and our experience of the world can bring us quite a long way from whichever prior we think is most natural.)
Cells, atoms and neurons aren't conscious entities in themselves. I see no principled reason for going to that level for an uninformed prior.
A true uninformed prior would probably say 'I have no idea', but if we're going to have some idea it seems more natural to start at all sentient individuals having equal weight. The individual is the level at which conscious experience happens, not the cell/atom/neuron.
Why do you think the individual is the level at which conscious experience happens?
(I tend to imagine that it happens at a range of scales, including both smaller-than-individual and bigger-than-individual. I don't see why we should generalise from our experience to the idea that individual organisms are the right boundary to draw. I put some reasonable weight on some small degree of consciousness occurring at very small levels like neurons, although that's more like 'my intuitive concept of consciousness wasn't expansive enough, and the correct concept extends here'.)
To be honest I'm not very well-read on theories of consciousness.
For an uninformed prior that isn't 'I have no idea' (and I suppose you could say I'm uninformed myself!), I don't think we have much of an option but to generalise from experience. Being able to say it might happen at other levels seems a bit too 'informed' to me.
IDK, structurally your argument here reminds me of arguments that we shouldn't assume animals are conscious, since we can only generalise from human experiences. (In both cases I feel like there's not nothing to the argument, but I'm overall pretty uncompelled.)
How far and how to generalize for an uninformed prior is pretty unclear. I could say just generalize to other human males because I can't experience being female. I could say generalize to other humans because I can't experience being another species. I could say generalize to only living things because I can't experience not being a living thing.
If you're truly uninformed I don't think you can really generalize at all. But in my current, relatively uninformed state I generalize to those that are biologically similar to humans (e.g. having a central nervous system), as I'm aware of research about the importance of this type of biology within humans for elements of consciousness. I also generalize to other entities that act in a similar way to me when in supposed pain (they try to avoid it, cry out, bleed, and become less physically capable, etc.).
I don't think you should give 0 probability to individual cells being conscious, because then no evidence or argument could move you away from that, if you're a committed Bayesian (the quick sketch below illustrates why). I don't know what an uninformed prior could look like. I imagine there isn't one. It's the reference class problem.
You should even be uncertain about the fundamental nature of reality. Maybe there are things more basic than fundamental particles, like strings, or maybe something else entirely. They could be conscious or not, and they may not exist at all.
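(A minimal numerical sketch of the zero-prior point above, assuming a simple two-hypothesis Bayes update; the likelihood values are invented purely for illustration and are not estimates of anything.)

```python
# Illustrative only: a two-hypothesis Bayes update showing why a prior of
# exactly 0 can never move, however strong the evidence. The likelihood
# numbers are invented for the example.

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Return P(hypothesis | evidence) given a prior and two likelihoods."""
    numerator = prior * likelihood_if_true
    denominator = numerator + (1 - prior) * likelihood_if_false
    return numerator / denominator if denominator > 0 else 0.0

# Evidence assumed to be 100x likelier if cells are conscious (made up).
print(bayes_update(0.0, 0.99, 0.0099))    # 0.0   -- a zero prior stays at zero
print(bayes_update(0.01, 0.99, 0.0099))   # ~0.50 -- a tiny nonzero prior can move a lot
```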
I certainly don't put 0 probability on that possibility.
I agree an uninformed prior may not be a useful concept here. I think the true uninformed prior is 'I have no idea what is conscious other than myself'.
I don't think that gives you an actual, proper quantitative prior, as a probability distribution.
Yeah, if I were to translate that into a quantitative prior, I suppose it would be that other individuals each have roughly a 50% chance of being conscious (i.e. I'm agnostic on whether they are or not).
Then I learn about the world. I learn about the importance of certain biological structures for consciousness. I learn that I act in a certain way when in pain and notice that other individuals do as well, etc. That's how I get my posterior that rocks probably aren't conscious and pigs probably are.
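(To make that update concrete, here is a minimal sketch of the kind of reasoning described above: start from an agnostic 50% prior and update on a couple of pieces of evidence. The likelihood ratios are made up purely for illustration, not real estimates.)

```python
# A rough sketch of "start agnostic at 50%, then update on what I learn about
# the world". The likelihood ratios below are invented for illustration only.

def update(prior, likelihood_ratio):
    """Bayes update in odds form: posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

prior = 0.5  # agnostic starting point for any individual entity

# Hypothetical likelihood ratios: how much more expected each observation is
# if the entity is conscious than if it is not (made-up numbers).
pig_evidence = [20, 10]       # has a central nervous system; shows pain behaviour
rock_evidence = [1/50, 1/20]  # lacks both

p = prior
for lr in pig_evidence:
    p = update(p, lr)
print(f"pig:  {p:.3f}")   # ~0.995 -- probably conscious

p = prior
for lr in rock_evidence:
    p = update(p, lr)
print(f"rock: {p:.3f}")   # ~0.001 -- probably not
```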
Ok, this makes more sense.
What do you count as an 'other individual'? Any physical system, including overlapping ones? What about your brain, and your brain but not counting one electron?
I'm a bit confused about whether I'm supposed to be answering on the basis of my uninformed prior, some slightly informed prior, or even my posterior here. Like, I'm not sure how much you want me to answer based on my experience of the world.
For an uninformed prior, I suppose any individual entity that I can visually see. I see a rock and I think 'that could possibly be conscious'. I don't lump the rock together with another nearby rock and think that maybe the 'double rock' is conscious, because they visually appear to me to be independent entities: they are not really connected in any physical way. This obviously does factor in some knowledge of the world, so I suppose it isn't a strictly uninformed prior, but it's about as uninformed as is useful to talk about?