Thank you for doing this, Max (and the supporters). These are good questions that warrant their own book =)
I find this passage makes a particularly good point, so I quote it below for those who skipped that part:
In the case of hermit crabs, we find the relevant behavioral pattern. So, we may infer that, like us, they feel pain. To be sure, they have many fewer neurons. But why should we think that makes a difference to the presence of pain? It didn’t make any difference with respect to the complex pattern of behavior the crabs display in response to noxious stimuli. Why should it make any difference with respect to the cause of that behavior? It might, of course. There is no question of proof here. But that isn’t enough to overturn the inference.
We need to look more closely at invertebrate behavior and see whether and how much it matches ours with respect to a range of experiences—bodily, perceptual and emotional.
Comparing with humans should, I suppose, come with many caveats. Still, for ancient(?) feelings like fear and pain, the approach seems valid from my layman's perspective in the area.
Of course, if one endorsed a type identity theory for conscious mental states, according to which experiences are one and the same as specific physico-chemical brain states, that would give one a reason to deny that digital beings lacked consciousness. But why accept the type identity theory? Given the diversity of sentient organisms in nature, it is extremely implausible to hold that for each type of experience, there is a single type of brain state with which it is identical.
If (globally bound) consciousness is “implemented” on a lower level, then it may still be possible that different physico-chemical brain states underlying the same qualia are relevantly identical on that lower level. I mention this because, IMO, there are good reasons to be sceptical about digital consciousness.
[...] it is is extremely implausible to hold that [...]
A typo
You are very welcome! :)
That passage is also one of my favourite parts of his answers, thanks for highlighting it.
I’ll take a look at that David Pearce post, thanks for the link.
Thanks for pointing out the typo; it's fixed now.