People believing utilitarianism could be predictably harmful, even if the theory actually says not to do the relevant harmful things. (Not endorsing this view: I think if you’ve actually spent time socially in academic philosophy, it is hard to believe that people who profess to be utilitarians are systematically more or less trustworthy than anyone else.)
As someone who has doubts about track-record arguments for utilitarianism, I want to go on record as saying that cuts both ways – I don’t think SBF’s actions are a reason to think utilitarianism is false or bad (nor true or good).
Like, in order to evaluate a person’s actions morally, we already need a moral theory in place. So the moral theory needs to be grounded in something else (for example, intuitions, human nature, and reasoned argument).
Sure, it’s possible that misunderstandings of the theory could prove harmful. I think that’s a good reason to push back against those misunderstandings!
I’m not a fan of the “esoteric” reasoning that says we should hide the truth because people are too apt to misuse it. I grant it’s a conceptual possibility. But, in line with my general wariness of naive utilitarian reasoning, my priors strongly favour norms of openness and truth-seeking as the best way to ward off these problems.