Although I suspect this is more likely false than true, it is not inconceivable that the individual members of some less intelligent species could matter more than individual humans. For example, their experiences, good or bad, could be more intense than ours, or they could experience life at a faster rate*. They don’t need more neurons for this to be true, either, which is part of why I am skeptical of the importance of neuron counts.
*If the rate didn’t matter, you’d run into problems with special relativity: if you are moving very fast relative to another person, each of you will see the other as aging more slowly, all else equal. So if only total lifespan mattered and not the rate of experience, each of you would judge the other as mattering more, because the other would appear to live longer. Ethics would then depend on the frame of reference, which is pretty weird, though perhaps not fatal.
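The reciprocity the footnote relies on can be made concrete with a little arithmetic (my own illustration, not from the original text): for two observers in uniform relative motion, each measures the other’s clock as slowed by the same Lorentz factor, so each would see the other "living longer" per unit of their own time.

```python
# Sketch of reciprocal time dilation: both observers compute the SAME
# Lorentz factor for each other, which is the symmetry driving the
# frame-dependence worry in the footnote above.

def lorentz_gamma(v_fraction_of_c: float) -> float:
    """Lorentz factor gamma = 1 / sqrt(1 - (v/c)^2), with speed given
    as a fraction of the speed of light c."""
    return 1.0 / (1.0 - v_fraction_of_c ** 2) ** 0.5

v = 0.8                    # illustrative relative speed: 80% of c
gamma = lorentz_gamma(v)   # 1 / sqrt(1 - 0.64) = 1 / 0.6 ≈ 1.667
rate_seen = 1.0 / gamma    # each sees the other's clock tick at ~0.6x

print(f"gamma = {gamma:.4f}, each sees the other's rate as {rate_seen:.4f}")
```

At 80% of c, each observer sees the other experiencing life at about 0.6 times their own rate, and by symmetry the judgment is mutual, which is exactly the problem if duration matters but rate does not.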
In my opinion, it is too simple to look only at either a flat, first-order count of neurons or some other gauge of experience (e.g. suffering and pleasure) while ignoring potential higher-order effects. Perhaps I disagree with many on how utility should be defined.