I don’t think you need infinities to show that very small probabilities of very large positive (or negative) outcomes mess up utilitarian thinking. (See Pascal’s Mugging or the Repugnant Conclusion.)
I agree. One does not even need large numbers or small probabilities: complex cluelessness is enough to make the result of any expected value calculation quite unclear. However, the result is not totally arbitrary, so I still endorse expectational total hedonistic utilitarianism.