Thank you for your comment!
Yes, I recognize that some longtermists bite the bullet and admit that humanity has virtually only instrumental value, but I am not sure they are the majority; it seems they are not. In any case, it seems to me that the vast majority of longtermists think the focus should be either humanity or digital beings. Animals are almost always left out of the picture.
I think you are right that “part of this” is a strategy to avoid weird messaging, but most longtermists I have discussed this with do not think that humanity doesn't matter, probably especially newer longtermists. Also, the naming of initiatives such as human-compatible AI, value alignment, and learning from humans makes me feel that these people genuinely care about the future of humanity.
And I am not sure digital sentience is even possible; we haven't proven that it is, right? I also don't know how to think about its feasibility. Maybe you can point me to some readings?
I find the neuron count model implausible:

1. Human infants have more neurons than adult humans.
2. Some nonhuman animals have more neurons than humans. (Incidentally, I have some credence, albeit low, that some nonhuman animals have higher moral weight than humans, one to one.)
3. The neuron count model would also yield seemingly absurd prescriptions. The total number of nematode neurons exceeds that of humans, which would prescribe focusing on nematodes over humans, which sounds no less absurd than focusing entirely on insect larvae. (Nonetheless, I don't put zero credence on these possibilities.)
4. There is evidence that the capacity to suffer varies greatly even within humans, down to the extreme of some humans who barely ever feel pain or suffer, and there is no evidence that these vast differences are due to neuron counts.
In any case, my aim for this post is simply to present the number of animals, while also making the case that I expect most of these animals to be either fish or insects. Essentially, this leaves readers to judge for themselves how much weight farmed animals should get in their moral and cause prioritization.
Yes, all those first points make sense. I just wanted to point to where I see the most likely cruxes.
Re: neuron count, the idea would be to use various transformations of neuron counts, or of a particular type of neuron. I think it’s a judgment call whether to leave it to the readers to judge; I would prefer giving what one thinks is the most plausible benchmark way of counting and then giving the tools to adjust from there, but your approach is sensible too.
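For concreteness, here is a minimal sketch of the kind of benchmark-plus-adjustment I have in mind, with rough, illustrative neuron counts and a single exponent as the adjustment knob (both the numbers and the choice of exponent are placeholders, not claims about the right values):

```python
# Sketch: moral weight as a power-law transformation of neuron count,
# normalized to humans. Neuron counts are rough, illustrative figures;
# the exponent alpha is the judgment call under discussion.

NEURONS = {
    "human": 86e9,     # ~86 billion (commonly cited estimate)
    "chicken": 2.2e8,  # ~220 million (rough)
    "salmon": 1e7,     # order-of-magnitude guess
    "honeybee": 1e6,   # ~1 million (rough)
}

def relative_weight(species: str, alpha: float = 1.0) -> float:
    """Moral weight of one individual relative to one human, given exponent alpha."""
    return (NEURONS[species] / NEURONS["human"]) ** alpha

for alpha in (1.0, 0.5, 0.25):
    print(f"alpha = {alpha}")
    for species in NEURONS:
        print(f"  {species}: {relative_weight(species, alpha):.2e}")
```

With alpha = 1 this is the plain neuron count view; smaller exponents compress the gaps between species, which is one way of encoding skepticism that moral weight scales linearly with neuron count.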
Sorry for the late reply; I missed your comment!
Thank you for sharing. Let me clarify your suggestion: do you mean that I should give my own model for accounting for moral significance, rather than just writing about the number of beings involved?
Also, do you mind sharing your credence in the possibility of digital sentience?
Yes, that’s an accurate characterization of my suggestion. Re: digital sentience, intuitively something in the 80-90% range?