Thanks for writing this! Thinking on it some more, I wonder about a possible tension between two of your big-picture claims here:
(1) “I think that the puzzles Richard raises are problems for people tempted by very formal theories of morality and action, which come with a host of auxiliary assumptions one may wish to reject.”
(2) “I simply have judgments about the ranking of various concretely described worlds, and (largely implicit) decision procedures for what to do when presented with the choice of various worlds.”
The first passage makes it sound like the problem is specifically with a narrow band of “very formal” theories, involving questionable “auxiliary assumptions”. This leaves open that we could still secure a complete ethical theory, just one that isn’t on board with all the specific “assumptions” made by the “very formal” theories. (E.g., someone might hope that appeal to incommensurability, or other ways of understanding value non-numerically, might help here.)
But the second passage, which maybe fits better with the actual argument of the post, suggests to me that what you’re really recommending is that we abandon ethical theory (as traditionally understood), and embrace the alternative task of settling on a decision procedure that we’re happy to endorse across a fairly wide range of circumstances.
E.g. I take it that a central task of decision theory is to provide a criterion that specifies which gambles are or aren’t worth taking. When you write, “What would I actually do in that situation? In practice, I think I’d probably decide to take actions via the following decision-procedure...” it seems to me that you aren’t answering the same question that decision theorists are asking. You’re not giving an alternative criterion; instead you’re rejecting the idea that we need one.
So, do you think it’s fair to interpret your post as effectively arguing that maybe we don’t need theory? “Puzzles for everyone” was arguing that these were puzzles for every theory. But you could reasonably question whether we need to engage in the project of moral/decision theory in the first place. That’s a big question! (For some basic considerations on the pro-theory side, see ‘Why Do We Need Moral Theories?’ on utilitarianism.net.)
But if I’m completely missing the boat here, let me know.
[ETA: on the object-level issues, I’m very sympathetic to Alex’s response. I’d probably go with “take exactly N” myself, though it remains puzzling how to justify not going to N+1, and so on.]