Thanks for writing this! Thinking on it some more, I wonder about a possible tension between two of your big-picture claims here:
(1) āI think that the puzzles Richard raises are problems for people tempted by very formal theories of morality and action, which come with a host of auxiliary assumptions one may wish to reject.ā
(2) āI simply have judgments about the ranking of various concretely described worlds, and (largely implicit) decision procedures for what to do when presented with the choice of various worlds.ā
The first passage makes it sound like the problem is specifically with a narrow band of āvery formalā theories, involving questionable āauxiliary assumptionsā. This leaves open that we could still secure a complete ethical theory, just one that isnāt on board with all the specific āassumptionsā made by the āvery formalā theories. (E.g., someone might hope that appealing to incommensurability, or to other non-numerical ways of understanding value, might help here.)
But the second passage, which maybe fits better with the actual argument of the post, suggests to me that what youāre really recommending is that we abandon ethical theory (as traditionally understood), and embrace the alternative task of settling on a decision procedure that weāre happy to endorse across a fairly wide range of circumstances.
E.g. I take it that a central task of decision theory is to provide a criterion that specifies which gambles are or arenāt worth taking. When you write, āWhat would I actually do in that situation? In practice, I think Iād probably decide to take actions via the following decision-procedure...ā it seems to me that you arenāt answering the same question that decision theorists are asking. Youāre not giving an alternative criterion; instead youāre rejecting the idea that we need one.
So, do you think itās fair to interpret your post as effectively arguing that maybe we donāt need theory? āPuzzles for everyoneā was arguing that these were puzzles for every theory. But you could reasonably question whether we need to engage in the project of moral/decision theory in the first place. Thatās a big question! (For some basic considerations on the pro-theory side, see āWhy Do We Need Moral Theories?ā on utilitarianism.net.)
But if Iām completely missing the boat here, let me know.
[ETA: on the object-level issues, Iām very sympathetic to Alexās response. Iād probably go with ātake exactly Nā myself, though it remains puzzling how to justify not going to N+1, and so on.]