[Question] Does ‘doing the most good’ also risk ‘doing the most bad’?


This sort of absolutism seems like it could be concerning. For example, things like eugenics, killing off mankind to save animals (I think that was the plot of a James Bond film?), and other clearly negative things can all come out of this type of optimization toward doing the most good.

It’s a bit like AI risk: if you tell an AI to ‘save the most lives’, who knows what it will do. If your ideology is predicated on “doing the most good”, it seems you have to be extremely careful to avoid falling into doing the most bad.

In contrast, if you only try to do “some good” or “more good”, you can probably find approaches with minimal risk of doing bad, and you can almost certainly avoid ‘doing the most bad’.
