Interview with Michael Tye about invertebrate consciousness

Michael Tye is a prominent philosopher focusing on the philosophy of mind. His interests include the philosophy of animal minds and consciousness, and in 2016 he published the book Tense Bees and Shell-Shocked Crabs: Are Animals Conscious? This book features one of the most in-depth discussions of invertebrate consciousness available.

This interview is about potential phenomenal consciousness (especially conscious pain) in invertebrates (especially insects). It is part of an interview series in which I interview leading experts on invertebrate consciousness to try to make progress on the question. You can find my previous interview with Jon Mallatt here, my previous interview with Shelley Adamo here, and a post where I justify my engagement with this question here.

1. You write that we are entitled to prefer the proposition that many animals, including bees and crabs, are phenomenally conscious because they appear to be conscious, and because supposing that they go through what looks like pain, but isn't actually pain, is ad hoc. What about the counterargument that honeybees and crabs just have many fewer neurons than we do and must economize on space, and so it seems reasonable to imagine that they do not have some necessary component of what makes these experiences conscious in us?

Humans and mammals that are in pain behave in characteristic ways. This behavior is complex, involving much, much more than simply withdrawing the body from the damaging or noxious stimulus, and it is caused by the feeling of pain. (In my 2016 book, I list various components of this behavior.) If we find a very similar pattern of behavior in other animals, we are entitled to infer that the same cause is operative unless we have good reason to think that their case is different. This was a point that Sir Isaac Newton made long ago with respect to effects in nature generally and their causes. In the case of hermit crabs, we find the relevant behavioral pattern. So, we may infer that, like us, they feel pain. To be sure, they have many fewer neurons. But why should we think that makes a difference to the presence of pain? It didn't make any difference with respect to the complex pattern of behavior the crabs display in response to noxious stimuli. Why should it make any difference with respect to the cause of that behavior? It might, of course. There is no question of proof here. But that isn't enough to overturn the inference. One other minor point: pain is a feeling. As such, it is inherently a conscious state. Necessarily, there is something it is like to undergo pain. So the question “What makes the experience of pain conscious?” is really not coherent. If the experience of pain is present, it is automatically conscious.

2. Do you expect that there will be clean, determinate answers about exactly which entities are conscious, now or at some point in the future? Or do you think there will always be a grey area with a judgement call involved?

Consciousness itself is not a grey matter. There may be greyness with respect to the content of consciousness (for example, am I feeling pain or pressure as my tooth is being filled?) but not with respect to consciousness itself. Consciousness does not have borderline cases in the way that life does. Still, confronted with a much simpler organism, we may not be able to tell from its behavior whether it has a faint glimmer of consciousness. I see no reason to suppose that in each and every case we will be able to know with any strong degree of certainty whether consciousness is present.

3. Do you think it is likely that C. elegans (the nematode worm with around 300 neurons) is conscious?

Unlikely. There is nothing in the behavior of the nematode worm that indicates the presence of consciousness. It is a simple stimulus-response system without any flexibility in its behavior. The same is true of the leech.

4. Have you updated your position at all since your 2016 book?

I have finished the draft of a new book on vagueness and the evolution of consciousness. In it, I say something about whether there is a trigger for consciousness in mammal brains, and I also say some additional things about the nature of consciousness from my perspective. This develops further claims I have made in the past and connects them to global workspace theory. The book will likely be published in 2020 or early 2021.

5. Do you have any ideas about what the next best steps could be to reach a more certain conclusion about invertebrate consciousness?

We need to look more closely at invertebrate behavior and see whether, and how much, it matches ours with respect to a range of experiences: bodily, perceptual, and emotional.

6. Concerning digital minds, do you think that any mind that satisfied some general high-level conditions, such as behaving similarly to an animal we believe to be conscious, would also be conscious? Or do you think it would require a process or architecture quite similar to what we find in human brains?

Behavior is obviously not the same as mental states. But behavior is evidence for mental states, whether experiential states or not. If we manage to build highly complex systems whose behavior mirrors ours, or at least is close to it, for a range of mental states, we are entitled to infer that they are subject to mental states too, unless, as noted above, we have good reason to think that their case is different. Merely noting that they are made of silicon is not enough. After all, what reason is there to suppose that crucially makes a difference? Of course, if one endorsed a type identity theory for conscious mental states, according to which experiences are one and the same as specific physico-chemical brain states, that would give one a reason to hold that digital beings lack consciousness. But why accept the type identity theory? Given the diversity of sentient organisms in nature, it is extremely implausible to hold that for each type of experience, there is a single type of brain state with which it is identical. The most plausible view is that experiences are multiply physically realized.

7. What do you think of the evidence that complex cognition can sometimes happen unconsciously in humans? This should arguably make us conclude that consciousness is at least not strictly required to produce many of the sorts of indicators of consciousness that we see in honeybees and crabs. Do you think this presents a challenge to claims that invertebrates might be conscious?

It is certainly true that cognition can occur without consciousness. Consider, for example, stimuli that are briefly presented to subjects and then backwardly masked so as to make them unconscious. They may still be processed deeply, with the result that they have high-level content that can prime subsequent behavior. In some of these cases, with slightly different timing and intensity settings, the backwardly masked stimuli may still be visible. Where this happens, the immediate behavior of subjects is very different. Why? The obvious answer is that it is the fact that the subjects are conscious of the stimuli in these cases that makes their immediate behavior different. So, the issue again is whether the behavior we see in honeybees and crabs is of the former sort or whether it is more like the behavior mammals display in response to their conscious states. The answer, I think, is that it is more like the latter. It is also worth pointing out that complex unconscious cognition in humans goes along with conscious activity too. Why think that if there is complex unconscious cognition in some cases in the invertebrate realm, consciousness is absent in the other cases?

8. We can distinguish between two aspects of pain, and they can occur independently: the sensory (including qualities such as burning, stabbing, and aching) and the affective (the intensity or unpleasantness). If insects can feel conscious pain, do you think it is likely that they would feel a lower degree of affective pain than humans? In other words, would it make sense to say they feel only a fraction of the affective pain that a human would feel in similar circumstances?

We know that patients who suffer intractable pain and who have undergone prefrontal leukotomies to reduce their pain level report that they still feel pain but they no longer mind it. For these patients, the affective component of pain has been removed. This is indicated in their behavior. The question for other organisms is again how much their ‘pain’ behavior is like ours. To the extent that they respond as we do, that is evidence that they feel what we do. If their behavior is more muted in various ways, that would be evidence that their pains are not as intense or unpleasant. In this regard, I might note that it makes sense to suppose that their pains are actually more intense than ours! After all, they are much less intelligent than we are, so it would not be unreasonable to suppose that Mother Nature would give them a bigger jolt of pain than we receive in response to noxious stimuli in order to get them to behave in ways most conducive to their survival.

Many thanks to an anonymous donor and the EA Hotel for funding me to conduct this interview. Thanks also to Rhys Southan for providing suggestions and feedback.