Thanks for this response. It seems like we are coming at this topic from very different starting assumptions. If I’m understanding you correctly, you’re saying that we have no idea whether LLMs are conscious, so it doesn’t make sense to draw any inferences from them to other minds.
That’s fair enough, but I’m starting from the premise that LLMs in their current form are almost certainly not conscious. Of course, I can’t prove this; it’s a belief based on my understanding of their architecture. I’m very much not saying they lack consciousness because they aren’t instantiated in a biological brain. Rather, I don’t think that GPUs performing parallel searches through a probabilistic word space are, by themselves, likely to support consciousness.
Stepping back a bit: I can’t know whether any animal other than myself is conscious, not even fellow humans. I can only reason by induction that consciousness is a feature of my brain, so other animals with brains similar in construction to mine may also be conscious. And I can use the observed output of those brains, namely behavior, as an external proxy for internal function. This makes me highly confident that, for example, primates are conscious, with my uncertainty growing with evolutionary distance.
Now along come LLMs to throw a wrench in that inductive chain. LLMs are, in my view, zombies that can do things previously only humans could do. And the truth is, a mosquito’s brain doesn’t have all that much in common with a human’s. So now I’m even more uncertain: is complex behavior really a sign of interiority? Does having a brain made of neurons really put lower animals on a continuum with humans? I’m not sure anymore.
I am using “conscious” and “sentient” as synonyms; apologies if this is confusing.
I don’t doubt at all that animals are sentient in the sense you mean. But I’m referring to the question of whether they have subjective experience: not just pleasure and pain signals, but a subjective experience of pleasure and pain.
This doesn’t feel like a red herring to me. Suffering takes on a moral valence only if it describes a conscious experience.