I’ll have to think about that. I’ve been working on a response, but on consideration, perhaps it’s best to reserve “utilitarianism” for the act of evaluating world-states according to overall sentient affinity for those states.
Utilitarianism might say that X is bad insofar as people experience the badness of X. The sum total of badness that people subjectively experience from X determines how bad it is.
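One way to make that aggregation claim explicit (a minimal sketch of my own, not something either of us has committed to) is:

$$B(X) = \sum_{i \in S} b_i(X)$$

where $S$ is the set of sentient beings affected by $X$ and $b_i(X)$ is the badness that individual $i$ subjectively experiences from $X$; on this reading, $X$ is worse than $Y$ exactly when $B(X) > B(Y)$.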
Deontology would reject that idea.
And it might be useful for “utilitarianism,” so defined, to refuse to grant that “deontology might have a point,” and vice versa.
Can you please explain how utilitarianism factors in moral uncertainty?
As far as I’m aware, it has nothing to say on the matter.