Second, I might be mistaken about what this agent’s choice would be. For instance, perhaps the lake is so cold that the pain of jumping in is of greater moral importance than any happiness I obtain.
Yeah, I think this is pretty plausible at least for sufficiently horrible forms of suffering (and probably for all forms, on reflection, given how bad the alternative moral views seem to me). I doubt my common-sense intuitions about bundles of happiness and suffering, formed in my current state of comfort, can properly empathize with the suffering-moments.
But given that you made the point above, I'm a bit surprised you also said this:
One of the following three things is true:
(1) One would not accept a week of the worst torture conceptually possible in exchange for an arbitrarily large amount of happiness for an arbitrarily long time.
(2) One would not accept such a trade, but believes that a perfectly rational, self-interested hedonist would accept it …
(3) One would accept such a trade, and further this belief is predicated on the existence of compelling arguments in favor of proposition (i).
What about “(4): One would accept such a trade, but believes that a perfectly rational, self-interested hedonist would not accept it”?
Yeah... I think I just forgot to add that? Although it seems the least (empirically) likely of the four possibilities.