Thanks for writing this! Thinking on it some more, I wonder about a possible tension between two of your big-picture claims here:
(1) "I think that the puzzles Richard raises are problems for people tempted by very formal theories of morality and action, which come with a host of auxiliary assumptions one may wish to reject."
(2) "I simply have judgments about the ranking of various concretely described worlds, and (largely implicit) decision procedures for what to do when presented with the choice of various worlds."
The first passage makes it sound like the problem is specifically with a narrow band of "very formal" theories, involving questionable "auxiliary assumptions". This leaves open that we could still secure a complete ethical theory, just one that isn't on board with all the specific "assumptions" made by the "very formal" theories. (E.g. someone might hope that appeal to incommensurability, or other ways of understanding value non-numerically, might help here.)
But the second passage, which maybe fits better with the actual argument of the post, suggests to me that what you're really recommending is that we abandon ethical theory (as traditionally understood), and embrace the alternative task of settling on a decision procedure that we're happy to endorse across a fairly wide range of circumstances.
E.g. I take it that a central task of decision theory is to provide a criterion that specifies which gambles are or aren't worth taking. When you write, "What would I actually do in that situation? In practice, I think I'd probably decide to take actions via the following decision-procedure..." it seems to me that you aren't answering the same question that decision theorists are asking. You're not giving an alternative criterion; instead you're rejecting the idea that we need one.
So, do you think it's fair to interpret your post as effectively arguing that maybe we don't need theory? "Puzzles for everyone" was arguing that these were puzzles for every theory. But you could reasonably question whether we need to engage in the project of moral/decision theory in the first place. That's a big question! (For some basic considerations on the pro-theory side, see "Why Do We Need Moral Theories?" on utilitarianism.net.)
But if I'm completely missing the boat here, let me know.
[ETA: on the object-level issues, I'm very sympathetic to Alex's response. I'd probably go with "take exactly N" myself, though it remains puzzling how to justify not going to N+1, and so on.]