Hi Michael,
Thanks for sharing those estimates.
I prefer to think about sentience as non-binary. In expectation, any being is arguably sentient to some extent, even if very little, so there is a sense in which “probability of sentience” is always very close to 1. Alas, we cannot rule out suffering in fundamental physics.
However, I guess we can still use “moral weight | sentience” × “probability of sentience” to estimate moral weights. In this case, “sentience” would mean something more restrictive than what I am referring to above. In the post, I am assuming this product is directly proportional to the number of neurons. Do you think neurons are a better proxy for “moral weight | (“restrictive”) sentience” than for the product?
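To make the decomposition concrete, here is a minimal sketch of the product above, with the neuron-count proxy normalised to humans. All numbers and names are illustrative assumptions, not figures from the post:

```python
def expected_moral_weight(p_sentience: float, weight_given_sentience: float) -> float:
    """Expected moral weight = P(sentience) * (moral weight | sentience)."""
    return p_sentience * weight_given_sentience

# Under the post's assumption, the product is directly proportional to the
# number of neurons, so relative to a reference species:
human_neurons = 86e9    # rough approximation, for illustration only
animal_neurons = 2.2e8  # hypothetical animal, for illustration only
relative_weight = animal_neurons / human_neurons
```

This is only meant to show the arithmetic: the two factors are not estimated separately here, since the neuron-count assumption pins down their product directly.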
Looking forward to that post!