Hi Michael,
Thanks for sharing those estimates.
I prefer to think about sentience as non-binary. In expectation, any being is arguably sentient to some extent, even if very little, so there is a sense in which "probability of sentience" is always very close to 1. Alas, we cannot rule out suffering in fundamental physics.
However, I guess we can still use "moral weight | sentience" * "probability of sentience" to estimate moral weights. In this case, "sentience" would mean something more restrictive than what I am referring to above. In the post, I am assuming this product is directly proportional to the number of neurons. Do you think neurons are a better proxy for "moral weight | ('restrictive') sentience" than for the product?
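To make the assumption concrete, here is a minimal sketch of the estimate I have in mind, where expected moral weight equals "moral weight | sentience" times "probability of sentience", and that product is taken to be directly proportional to the number of neurons. The neuron counts used are illustrative placeholders, not figures from the post.

```python
# Sketch of the proportional-to-neurons assumption:
# expected moral weight = E[moral weight | sentience] * P(sentience),
# assumed proportional to neuron count (normalised to humans).

NEURONS_HUMAN = 86e9  # rough estimate of human neuron count

def moral_weight_relative_to_humans(neurons: float) -> float:
    """Moral weight relative to a human under the neuron-count proportionality assumption."""
    return neurons / NEURONS_HUMAN

# Illustrative example: a being with ~1e9 neurons (hypothetical figure)
print(moral_weight_relative_to_humans(1e9))  # ~0.012
```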
Looking forward to that post!