Thanks Max. More research in this space feels important. For me, degrees of sentience should determine how much moral consideration we grant to beings (animals, humans, maybe even aliens and AGIs).
I wrote this re: sentientism—may be of interest https://secularhumanism.org/2019/04/humanism-needs-an-upgrade-is-sentientism-the-philosophy-that-could-save-the-world/ .
Thanks Jamie!
Nice article. Thanks for the link.
I don’t think I agree with your claim in the article that degrees of sentience have been scientifically demonstrated. Is there a source you have in mind for that? I’ve been looking at the literature on the topic, and the arguments that degrees of sentience exist seem to be philosophical rather than empirical, and none of them strike me as that strong.
I guess the reason you are using sentientism rather than hedonistic utilitarianism is that you think the term sounds better / has a better framing?
Thanks Max.
I’m an amateur here, so my confidence isn’t necessarily that high. I am taking “degrees of sentience” from the research (as summarised in Luke’s paper) showing varying levels of complexity in the nervous systems that generate sentience and in the behaviours that demonstrate it. Given that sentience is a subjective experience, it’s hard to judge its quality or intensity directly. However, from examining behaviour and hardware / biology, it does appear that some types of sentience are likely to be richer than others (insect vs. human, for example). Arguably, that could warrant different degrees of moral consideration. I suspect that, while we will want to define a lower boundary of sentience for the purposes of moral consideration, we may never find a clear binary edge. Sentience is likely to be just a particular class of advanced information processing.
I’m using the term sentientism partly because it helps focus on sentience as the primary determinant of which beings deserve moral consideration. We can use it to make decisions about whether to have compassion for humans, non-human animals, and potentially even sentient AGIs or aliens. Hedonistic utilitarianism implies sentience (given that it focuses on the experiences of pleasure and suffering), but it has traditionally (despite Bentham) focused only on human experience.
Sentientism, like Humanism, also has an explicit commitment to evidence and reason, rejecting supernatural rationales for morality. As I understand it, hedonistic utilitarianism is neutral on that question.
For anyone interested in refining these ideas, we run a friendly, global group on Sentientism here: https://www.facebook.com/groups/sentientism/. All are welcome, whether or not the term fits you personally. Philosophers, writers, activists, policy people, and interested lay people (like me) from 43 countries so far.
Yeah, fair enough. I wish you good luck with your group and project :)