Interesting! I intended the post largely as a response to someone with views like yours. In short, I think the considerations I provided based on how animals behave are very well explained by the supposition that they're conscious. I also find RP's arguments against neuron counts completely devastating.
RP had some arguments against conscious subsystems affecting moral weight very significantly that I found pretty convincing.
In regards to your first point, I also don't see why we'd think that the degree of attention correlates with neuron counts or determines the intensity of consciousness.
I worked on some of them with RP myself here.
FWIW, I found Adam's arguments convincing against the kinds of views he argued against, but I don't think they covered the cases in point 2 here.
RP had some arguments against conscious subsystems affecting moral weight very significantly that I found pretty convincing.
I might have written some of them! I still have some sympathy for the hypothesis, and I think it can matter when you reason using expected values, taking the arguments into account, even if you assign the hypothesis only something like a 1% probability. The probabilities can matter here.
In regards to your first point, I also don't see why we'd think that the degree of attention correlates with neuron counts or determines the intensity of consciousness.
I believe the intensity of suffering consists largely (maybe not exclusively) in how much it pulls your attention, specifically its motivational salience. Intense suffering that's easy to ignore seems like an oxymoron. I discuss this a bit more here.
Welfare Footprint Project's pain definitions also refer to attention as one of the criteria (along with other behaviours):
Annoying pain:
(...) Sufferers can ignore this sensation most of the time. Performance of cognitive tasks demanding attention are either not affected or only mildly affected. (...)
Hurtful pain:
(...) Different from Annoying pain, the ability to draw attention away from the sensation of pain is reduced: awareness of pain is likely to be present most of the time, interspersed by brief periods during which pain can be ignored depending on the level of distraction provided by other activities. (...)
Disabling pain:
(...) Inattention and unresponsiveness to milder forms of pain or other ongoing stimuli and surroundings is likely to be observed. (...)
Excruciating pain seems entirely behaviourally defined, but I would assume its effects on attention are like those of Disabling pain, or (much) stronger.
Then, we can ask “how much attention can be pulled?” And we might think:
having more things you’re aware of simultaneously (e.g. more details in your visual field) means you have more attention to pull, and
more neurons allow you to be aware of more things simultaneously,
so brains with more neurons can have more attention to pull.
I don't think this is right. We could imagine a very simple creature that experiences very little pain but is totally focused on it. It's true that creatures like us normally tend to focus more on more intense pain, but this doesn't mean that's the relevant benchmark for intensity. My claim is that the causal arrow goes the other way.
But if I did, I think it would make me take animal consciousness even more seriously. For simple creatures, pain takes up their whole world.
Maybe it'll help for me to rephrase: if a being has more things it can attend to (be aware of, have in its attention) simultaneously, then it has more attention to pull. All else equal, it can attend to more if, for example, it has a richer/more detailed visual field, similar to having more pixels on a computer screen.
We could imagine a very simple creature that experiences very little pain but is totally focused on it.
If it’s very simple, it would probably have very little attention to pull (relatively), so the pain would not be intense under the hypothesis I’m putting forward.
But if I did, I think it would make me take animal consciousness even more seriously. For simple creatures, pain takes up their whole world.
I also give some weight to this possibility, i.e. that we should measure attention in individual-relative terms, and it’s something more like the proportion of attention pulled that matters.
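To make the contrast concrete, here is a minimal toy sketch in Python. Everything in it is an illustrative assumption on my part, not a claim from the discussion: that "attention capacity" can be summarised by a single number, that a painful stimulus pulls some amount of it, and the particular numbers chosen. It just contrasts the absolute view (intensity is how much attention is pulled, so beings with more attention to pull can suffer more intensely) with the individual-relative view (intensity is the proportion of the being's attention that is pulled).
```python
# Purely illustrative toy model (assumptions, not established results):
# score pain "intensity" by attention pulled, either in absolute terms
# or as a proportion of the individual's total attention capacity.

def absolute_intensity(attention_pulled: float) -> float:
    """Absolute view: intensity is simply how much attention is pulled."""
    return attention_pulled

def proportional_intensity(attention_pulled: float, capacity: float) -> float:
    """Individual-relative view: intensity is the fraction of capacity pulled."""
    return attention_pulled / capacity

# Hypothetical numbers: a simple creature whose pain takes up its whole
# (small) attentional field vs. a complex creature with far more capacity,
# only partly absorbed by its pain.
beings = {
    "simple creature": {"capacity": 10.0, "pulled": 10.0},
    "complex creature": {"capacity": 1000.0, "pulled": 300.0},
}

for name, b in beings.items():
    print(
        f"{name}: absolute = {absolute_intensity(b['pulled']):.0f}, "
        f"proportional = {proportional_intensity(b['pulled'], b['capacity']):.2f}"
    )

# On the absolute view the complex creature's pain scores 30x higher (300 vs 10);
# on the proportional view the simple creature's pain is maximal (1.00 vs 0.30).
# Which measure tracks moral weight is exactly what is being debated above.
```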