Here’s my take on this issue. I think that mistaking pain sensation for moral worth is a serious problem, one that is leading to significant moral costs in our community. I believe we need to reevaluate what neural structure & function (or other compute substrate) qualifies for moral worth, and to what degree. This should involve both an empirical study of the physical properties of the system and a philosophical discussion of which attributes matter.
As a starting point, I suggest we use the presence of a social contract between sapient beings capable of verbally communicating preferences. Only then should we consider non-verbal communication. In the absence of any sort of abstract communication, I think we should be extremely skeptical that the entity involved has sufficient computational complexity to have moral worth.