Not to rehash everyone’s well-rehearsed position on the hard problem, but surely, in the above, sentience is the red herring? If non-human animals are not conscious, i.e. “there are no lights on inside” rather than “the lights are on but dimmer”, then there is actually no suffering?
Edit: A good intuition pump on this crux is David Chalmers’s ‘Vulcan’ thought experiment (see the 80k podcast transcript) - my intuition tells me we care about the Vulcans, but maybe the dominant philosophy of mind position in EA is to not care about them (I might be confounding overlap between illusionism and negative utilitarianism though)? That seems like a pretty big crux to me.
I don’t see, at the evolutionary-functional level, why human-type ‘consciousness’ (whatever that means) would be required for sentience (adaptive responsiveness to positive/negative reinforcers, i.e. pleasure/pain). Sentience seems much more foundational, operationalizable, testable, functional, and clear.
But then, 99% of philosophical writing about consciousness strikes me as wildly misguided, speculative, vague, and irrelevant.
Psychology has been studying ‘consciousness’ ever since the 1850s, and has made a lot of progress. Philosophy, not so much, IMHO.
Follow-up: I’ve never found Chalmers’ zombie or Vulcan thought experiments at all compelling. They sound plausible at first glance as interesting edge cases, but I think they’re not at all plausible or illuminating once one asks how such a hypothetical being could have evolved, and whether its cognitive/affective architecture really makes sense. The notion of a mind that doesn’t have any valences regarding external objects, beings, or situations would boil down to a mind that can’t make any decisions, can’t learn anything (through operant conditioning), and can’t pursue any goals—i.e. not a ‘mind’ at all.
I critiqued the Chalmers zombie thought experiment in this essay from c. 1999. Also see this shorter essay about the possible functions of human consciousness, which I think center around ‘public relations’ functions in our hypersocial tribal context, more than anything else.