I mean: Yeah, having a high impact is important, but don’t throw out “enjoying life” and other things that might not be easily quantifiable. We’re only human; if we try to forcibly give one consideration 1000x the weight of another, it might well mess up our judgment in some bad way.
[This doesn’t feel well phrased; hopefully it still makes sense. If not, I’ll elaborate.]
Yes—this post came out of drafting a larger post I’m writing that tries to deconstruct EA ideas more generally, and the way that EA really isn’t the same as utilitarianism.
(I’ve never spoken about this with anyone, but) I think about this like the classic balance between utilitarianism and deontology: “Go three-quarters of the way from deontology to utilitarianism and then stop.”
I think this is roughly right.
(That said, it’s a balance, and three-quarters of the way to 100% EA dedicate-ism will sometimes feel and look quite a lot like crazy sacrifice, IMO.)
Yeah, I have no idea if 75% is the correct constant. I mainly read this as “definitely not 100%, and also not 99%.”
[not a philosopher]