Octopuses (Probably) Don’t Have Nine Minds


Key Takeaways

Here are the key takeaways from the full report:

  1. Based on the split-brain condition in humans, some people have wondered whether some humans “house” multiple subjects.

  2. Based on superficial parallels between the split-brain condition and the apparent neurological structures of some animals—such as chickens and octopuses—some people have wondered whether those animals house multiple subjects too.

  3. To assign a non-negligible credence to this possibility, we’d need evidence that parts of these animals aren’t just conscious, but that they have valenced conscious states (like pain), as that’s what matters morally (given our project’s assumptions).

  4. This evidence is difficult to get:

    1. The human case shows that unconscious mentality is powerful, so we can’t infer consciousness from many behaviors.

    2. Even when we can infer consciousness, we can’t necessarily infer a separate subject. After all, there are plausible interpretations of split-brain cases on which there are not separate subjects.

    3. Even if there are multiple subjects housed in an organism in some circumstances, it doesn’t follow that there are always multiple subjects. These additional subjects may only be generated in contexts that are irrelevant for practical purposes.

  5. If we don’t have any evidence that parts of these animals are conscious or that they have valenced conscious states, then insofar as we’re committed to having an empirically-driven approach to counting subjects, we shouldn’t postulate multiple subjects in these cases.

  6. That being said, the author is inclined to place up to a 0.1 credence that there are multiple subjects in the split-brain case, but no higher than 0.025 for the 1+8 model of octopuses.

Introduction

This is the sixth post in the Moral Weight Project Sequence. The aim of the sequence is to provide an overview of the research that Rethink Priorities conducted between May 2021 and October 2022 on interspecific cause prioritization—i.e., making resource allocation decisions across species. The aim of this post, which was written by Joe Gottlieb, is to summarize his full report on phenomenal unity and cause prioritization, which explores whether, for certain species, there are empirical reasons to posit multiple welfare subjects per organism. That report is available here.

Motivations and the Bottom Line

We normally assume that there is one conscious subject—or one entity who undergoes conscious experiences—per conscious animal. But perhaps this isn’t always true: perhaps some animals ‘house’ more than one conscious subject. If those subjects are also welfare subjects—beings with the ability to accrue welfare goods and bads—then this might matter when trying to determine whether we are allocating resources in a way that maximizes expected welfare gained per dollar spent. When we theorize about these animals’ capacity for welfare, we would no longer be theorizing about a single welfare subject, but multiple such subjects.[1]

In humans, people have speculated about this possibility based on “split-brain” cases, where the corpus callosum has been wholly or partially severed (e.g., Bayne 2010; Schechter 2018). Some non-human animals, like birds, approximate the split-brain condition as the norm, and others, like the octopus, exhibit a striking lack of integration and highly decentralized nervous systems, with surprising levels of peripheral autonomy. And in the case of the octopus, Peter Godfrey-Smith suggests that “[w]e should…at least consider the possibility that an octopus is a being with multiple selves”, one for the central brain, and then one for each arm (2020: 148; cf. Carls-Diamante 2017, 2019, 2022).

What follows is a high-level summary of my full report on this topic, focusing on Octopodidae, as that’s the family for which we have the best evidence for multiple subjects per organism.[2] In assessing this possibility, I make three key assumptions:

  • Experiential hedonism: an entity can accrue welfare goods and bads if and only if it can undergo either positively or negatively valenced conscious mental states.

  • Mental states can be unconscious: most, if not all, conscious mental states have unconscious mental counterparts. Moreover, many sophisticated behaviors are caused by unconscious states, and even if caused by conscious states, they are not always caused by those states in virtue of being conscious. Unconscious mentality is quite powerful and routinely underestimated.

  • Default to One Subject Assumption: we begin by provisionally assuming that there is only one subject per animal, per organism, etc. Thus, absent sufficiently robust positive evidence against this default assumption, we should continue to assume that there is one subject per octopus.

With these assumptions in mind, there are two hypotheses of interest:

  • The Action-Relevant Hypothesis: The default condition for members of Octopodidae is that they house up to 9 welfare subjects, such that for any harm or benefit to any token octopus, we get a 9x impact in expectation.

  • The Non-Action-Relevant Hypothesis: There are some rare contexts—when all arms are amputated, or when the brain is not exerting central control over the arms—where members of Octopodidae either house up to 9 welfare subjects or can ‘splinter’ into 9 welfare subjects, such that for any harm or benefit to any token octopus, we get a 9x impact in expectation.

The bottom line is that, based on the arguments I discuss at length in the full report, I assign a credence of 0.025 to the Action-Relevant Hypothesis and a credence of 0.035 to the Non-Action-Relevant Hypothesis.
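
To put these numbers in perspective, here is a rough, purely illustrative calculation. It assumes (setting aside the caveat in footnote [1]) that, were the Action-Relevant Hypothesis true, every octopus would house exactly 9 welfare subjects with equal welfare ranges that are harmed and benefited together:

0.025 × 9 + (1 − 0.025) × 1 = 1.2 subjects per octopus in expectation

In other words, at this credence, the expected impact of harming or benefiting a given octopus is only about 20% higher than under the default one-subject assumption.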

Four Questions about There Being Multiple Subjects Per Octopus

In the academic philosophical literature, there is only one clear endorsement of, and extended argument for, the claim that there can be more than one subject per animal: namely, Schechter’s (2018) examination of the split-brain condition. However, her arguments do not readily carry over to octopuses. There are several reasons for this, but the most relevant is the following. Schechter’s case starts from the claim that both the right and left hemispheric systems in humans can independently support conscious experience. She then infers that the reason these experiences are not part of a single phenomenally unified experiential perspective is that they fail to be access unified.[3] Since subjects, according to Schechter, are individuated by experiential perspectives, it follows that split-brain patients house two subjects.

This argument makes a highly contentious assumption: namely, that failures of access unity entail failures of phenomenal unity.[4] Even if we grant it, we can’t make an analogous starting assumption to the effect that each octopus arm is, on its own, sufficient for consciousness.[5] The question—or at least one of our questions—is whether this assumption is true.

So, we can split our task into four questions, where a ‘yes’ answer to each subsequent one is predicated on a ‘yes’ to the prior question:

  1. The Mind Question: Do octopuses generally house 9 minded subjects (or at least more than one minded subject)?

  2. The Conscious Mind Question: Do octopuses generally house 9 consciously minded subjects—that is, 9 subjects capable of being in conscious mental states (or at least more than one consciously minded subject)?

  3. The Welfare Subject Question: Do octopuses generally house 9 conscious, affectively minded subjects—that is, 9 subjects capable of being in conscious affective mental states (or at least more than one conscious, affectively minded subject)?

  4. The Correlation Question: If octopuses generally house more than one conscious, affectively minded subject, are the harms and benefits of those subjects correlated, such that harming one subject affects the welfare of other subjects housed in the same organism?

Q3 and Q4, of course, are what ultimately matter—but we have to get through Q1 and Q2 to get to them. My take is that there’s some evidence for a ‘yes’ answer to Q1, but no evidence for a ‘yes’ answer to Q2 and Q3. So, Q4 doesn’t even come up.

Question 1: The Mind Question

Carls-Diamante (2019) provides the most sustained case that each arm constitutes an independent cognitive system, i.e., not merely a cognitive subsystem. If we let each independent cognitive system count as a subject, it follows that each arm constitutes a subject. Carls-Diamante’s case hinges largely on the cognitive role of the arms in storing and carrying out stereotypic motor programs (as seen in fetching), along with their functional autonomy and self-sufficiency, illustrated by the sparse connections between them and the central components of the octopus nervous system. This autonomy is most striking in cases of amputation, where the arms retain much of their sensorimotor control and processing functions, including grasping behavior elicited by tactile stimulation of the suckers (Rowell 1963).

But there are at least two problems with Carls-Diamante’s position. First, by her own lights, she’s working with a “relaxed” (2019: 465) stance on what counts as cognition, on which sensorimotor coordination counts, at best, as the most “rudimentary” form of cognition. This is a controversial and deflationary interpretation of cognition. That’s fine in itself, but it significantly weakens the inference to consciousness. Second, if we only have multiple subjects when the octopus arms are amputated, as Carls-Diamante suggests (2019: 478), then we might get a case for conscious arms, but it will be a case that’s basically irrelevant to the Action-Relevant Hypothesis, since arms usually aren’t amputated prior to death.[6]

Question 2: The Conscious Mind Question

There’s little doubt that whole octopuses are conscious (Mather 2008). But the animal’s being conscious doesn’t imply that each octopus arm is individually conscious. If we grant that each arm is a subject (because it constitutes an independent cognitive system), then we can ask whether the states of those arm-based subjects are conscious in much the same way as we would ask this question of anything else.

How do we assess consciousness? Typically, we either look for proxies for consciousness, such as exhibiting trace conditioning (Birch 2022), or we reason from a theory. Either way, there doesn’t seem to be any positive evidence for thinking that the octopus arms are conscious. Sensorimotor processing isn’t necessarily conscious; we have no evidence that the arm-based systems have a global workspace or are capable of higher-order representation; and the arms don’t exhibit trace conditioning, rapid-reversal learning, or anything else that might serve as a positive marker. Thus, given that Mental States can be Unconscious and the Default to One Subject Assumption, there is no reason to think that the octopus houses more than one conscious subject.[7]

Question 3: The Welfare Subject Question

Suppose that each arm-based system has its own conscious states. It does not follow that these states are affective, and given experiential hedonism, the capacity for conscious affective states is a necessary (and sufficient) condition for these systems to constitute welfare subjects. Whether the arms have conscious affective states depends on the kinds of states they have, which even those sympathetic to there being multiple subjects, like Carls-Diamante (2017: 1279), take to be quite limited. Of course, an octopus arm-based subject only needs to instantiate one kind of affective state to be a welfare subject. But we have no evidence that arm-based subjects can feel sad or happy or undergo prolonged bouts of depression, nor even that they can be in pain. To be clear, and in keeping with a common refrain, we do have evidence that whole octopuses can be in pain. For example, in Crook’s (2021) study of responses to injury in the pygmy octopus (Octopus bocki), the animals directed grooming at the site of an acetic acid injection, and that grooming ceased once lidocaine was applied. In addition, the octopuses preferred chambers in which they had been placed after being given lidocaine over chambers in which they had received the initial injection. This could be evidence of valenced pain. However, Crook (ibid.) is clear that noxious sensory information is processed not in the arms but in the central brain. This suggests that when acetic acid is injected into one of the arms, pain may be felt in the arm but not by the arm.

On the other hand, in Rowell’s (1963) experiments on amputated octopus arms, pricking an amputated arm with a pin caused the skin to flinch and the arm to move away from the stimulus. Does this suggest that the arm-based systems are in conscious pain? There are three points to make here. First, given that this involves amputated arms, there is again the question of whether it speaks to the Action-Relevant or the Non-Action-Relevant Hypothesis. Second, it isn’t obvious that such behavior marks valenced pain rather than mere pain, since we have evidence, from pain asymbolia (Bain 2014) as well as more mundane cases, that pain is not necessarily painful. Valence requires more than mere withdrawal and reactive behaviors (Shevlin 2021).[8]

Finally, while this behavior can be caused by conscious pain states, as before, this doesn’t mean that such states cause such behaviors in virtue of being conscious. Indeed, we have evidence that withdrawal behavior is frequently unconscious. Noxious stimulation can cause humans in vegetative states to yell, withdraw, or display ‘pained’ facial expressions (Laureys 2007). In addition, the lower limbs of patients with complete spinal cord injuries, in which the patients cannot feel anything, still exhibit the withdrawal flexion reflex (Dimitrijević & Nathan 1968; Key 2016: 4).[9]

Conclusion

The upshot here is straightforward. We don’t seem to have a good reason to suppose that independent octopus arms are conscious subjects, much less welfare subjects. And if they aren’t, then we should assign very low credences to the key hypotheses:

  • The Action-Relevant Hypothesis: The default condition for members of Octopodidae is that they house up to 9 welfare subjects, such that for any harm or benefit to any token octopus, we get a 9x impact in expectation.

  • The Non-Action-Relevant Hypothesis: There are some rare contexts—when all arms are amputated, or when the brain is not exerting central control over the arms—where members of Octopodidae either house up to 9 welfare subjects or can ‘splinter’ into 9 welfare subjects, such that for any harm or benefit to any token octopus, we get a 9x impact in expectation.

In the full report, I argue for all this in more detail; I also make the same point about chickens. Whatever the intuitive appeal of the possibility of there being multiple subjects per organism in these cases, that possibility probably isn’t the way things are.

Acknowledgments


This research is a project of Rethink Priorities. It was written by Joe Gottlieb. Thanks to Bob Fischer for helpful feedback on this research. If you’re interested in RP’s work, you can learn more by visiting our research database. For regular updates, please consider subscribing to our newsletter.

References

Bain, D. (2014). Pains that don’t hurt. Australasian Journal of Philosophy, 92(2), 305–320. https://doi.org/10.1080/00048402.2013.822399

Bayne, T. (2010). The Unity of Consciousness. Oxford University Press. https://doi.org/10.1093/acprof:oso/9780199215386.001.0001

Birch, J. (2022). The search for invertebrate consciousness. Noûs, 56(1), 133–153.

Block, N. (2007). Consciousness, accessibility, and the mesh between psychology and neuroscience. Behavioral and Brain Sciences, 30(5–6), 481–499. https://doi.org/10.1017/S0140525X07002786

Bublitz, A., Dehnhardt, G., & Hanke, F. D. (2021). Reversal of a spatial discrimination task in the common octopus (Octopus vulgaris). Frontiers in Behavioral Neuroscience, 15, 614523. https://doi.org/10.3389/fnbeh.2021.614523

Carruthers, P. (2018). Valence and value. Philosophy and Phenomenological Research, 97, 658–680. https://doi.org/10.1111/phpr.12395

Carls-Diamante, S. (2017). The octopus and the unity of consciousness. Biology and Philosophy, 32, 1269–1287. https://doi.org/10.1007/s10539-017-9604-0

Carls-Diamante, S. (2019). Out on a limb? On multiple cognitive systems within the octopus nervous system. Philosophical Psychology, 32(4), 463–482. https://doi.org/10.1080/09515089.2019.1585797

Dimitrijević, M. R., & Nathan, P. W. (1968). Studies of spasticity in man: Analysis of reflex activity evoked by noxious cutaneous stimulation. Brain, 91(2), 349–368. https://doi.org/10.1093/brain/91.2.349

Dung, L. (2022). Assessing tests of animal consciousness. Consciousness and Cognition, 105, 103410.

Godfrey-Smith, P. (2020). Metazoa: Animal Life and the Birth of the Mind. Farrar, Straus and Giroux.

Irvine, E. (2020). Developing valid behavioral indicators of animal pain. Philosophical Topics, 48(1), 129–153. https://doi.org/10.5840/philtopics20204817

Key, B. (2016). Why fish do not feel pain. Animal Sentience, 1(3).

Laureys, S. (2007). Eyes open, brain shut. Scientific American, 296(5), 84–89. https://doi.org/10.1038/scientificamerican0507-84

Marks, C. (1980). Commissurotomy, Consciousness, and Unity of Mind. MIT Press.

Mather, J. (2008). Cephalopod consciousness: Behavioural evidence. Consciousness and Cognition, 17, 37–48.

Rowell, C. H. F. (1963). Excitatory and inhibitory pathways in the arm of octopus. Journal of Experimental Biology, 40, 257–270.

Schechter, E. (2018). Self-Consciousness and “Split” Brains: The Minds’ I. Oxford University Press.

Shevlin, H. (2021). Negative valence, felt unpleasantness, and animal welfare. https://henryshevlin.com/wp-content/uploads/2021/11/Felt-unpleasantness.pdf

Tye, M. (2003). Consciousness and Persons: Unity and Identity. MIT Press.

Notes

[1] This, of course, is setting aside the welfare ranges of each of these constituent subjects; it could be that, within an individual animal, while there are multiple subjects, not all subjects have the same welfare ranges, with some being far narrower than others.

[2] The full report also includes extensive discussion of the split-brain condition in humans, along with a discussion of whether members of Gallus gallus domesticus house more than one welfare subject.

[3] Two experiences E1 and E2 are phenomenally unified when there is something it is like to have E1 and E2 together in a way that is not merely conjunctive. Two experiences E1 and E2 are access unified when their contents can be jointly reported on, and jointly employed in the rational control of reasoning and behavior.

[4] This is rejected, for example, by Bayne (2010). Also relevant here is experimental evidence (e.g., from the Sperling paradigm) for phenomenal overflow: phenomenally conscious states that are not accessed, if not accessible at all (Block 2007).

[5] This is probably why one of the few people to write on this topic, Carls-Diamante (2017), conditionalizes her thesis: “if the brain and the arms can generate local conscious fields, the issue arises as to whether subjective experience in an octopus would be integrated or unified, given the sparseness of interactions between the components of its nervous system” (2017: 1273, emphasis added).

[6] This is reminiscent of the “contextualist model” of split-brain patients, where we only have two subjects when under experimental conditions (Marks 1980; Tye 2003). Godfrey-Smith (2020) favors something like this approach for the octopus, but even then, he still thinks there is so-called partial unity for affective states. Roughly, this means that while there are contextually two subjects, there is only one (e.g.) token pain “shared” between those subjects in those contexts.

[7] Interestingly, it has been argued that octopuses do have something akin to a global workspace (Mather 2008) and are capable of rapid-reversal learning (Bublitz et al 2021), but again, this does not tell us that the arms have and can do these things. Presumably, if (somehow) my mental states were in your global workspace, that wouldn’t make me have conscious experiences.

[8] Even if these states were valenced, this wouldn’t necessarily show that these states were conscious (see fn. 4), or that octopus arms were conscious (by having any other conscious states, for instance). Motivational trade-offs are a hallmark of valenced states, but Irvine (2020) argues that even C. elegans perform such trade-offs. Presumably, C. elegans are not conscious in any way.

[9] For further discussion on this, see Dung (2022).