You should also take the probability of sentience into account. There are some estimates from my colleagues at RP here: https://forum.effectivealtruism.org/posts/T5fSphiK6sQ6hyptX/opinion-estimating-invertebrate-sentience
We also have a post about the value/usefulness of neuron counts coming soon: https://forum.effectivealtruism.org/s/y5n47MfgrKvTLE3pw
Hi Michael,
Thanks for sharing those estimates.
I prefer to think about sentience as non-binary. In expectation, any being is arguably sentient to some extent, even if very little, so there is a sense in which “probability of sentience” is always very close to 1. Alas, we cannot rule out suffering in fundamental physics.
However, I guess we can still use “moral weight | sentience” × “probability of sentience” to estimate moral weights. In this case, “sentience” would mean something more restrictive than what I am referring to above. In the post, I am assuming this product is directly proportional to the number of neurons. Do you think neurons are a better proxy for “moral weight | (“restrictive”) sentience” than for the product?
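To make the decomposition concrete, here is a minimal sketch in Python of how I am using it, assuming, as in the post, that “moral weight | sentience” is directly proportional to the number of neurons. The sentience probabilities and neuron counts below are illustrative placeholders I made up for the example, not Rethink Priorities’ estimates.

```python
# Sketch of E[moral weight] = P(sentience) * E[moral weight | sentience],
# under the assumption that E[moral weight | sentience] is directly
# proportional to the number of neurons (relative to a human).

# Hypothetical inputs for illustration only (not Rethink Priorities' estimates).
animals = {
    # name: (probability of sentience, neuron count)
    "chicken": (0.80, 2.2e8),
    "honey_bee": (0.30, 9.6e5),
}

HUMAN_NEURONS = 8.6e10  # approximate human neuron count

for name, (p_sentience, neurons) in animals.items():
    # Moral weight conditional on sentience, proportional to neurons.
    weight_given_sentience = neurons / HUMAN_NEURONS
    # Expected moral weight relative to a human.
    expected_weight = p_sentience * weight_given_sentience
    print(f"{name}: expected moral weight relative to a human = {expected_weight:.2e}")
```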
Looking forward to that post!