This is true, but we could all be mistaken. That doesn’t seem unlikely to me, considering that our brains simply were not built to handle such incredibly small probabilities and such incredibly large magnitudes of disutility. That said, in practice I won’t bite the bullet, any more than people who say they’d choose torture over dust specks probably do, or any more than pure impartial consequentialists truly sacrifice all their own frivolities for altruism. (This latter case is often excused as merely avoiding burnout, but I seriously doubt the level of self-indulgence of the average consequentialist EA, myself included, is anywhere close to altruistically optimal.)
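To make the arithmetic concrete, here is a minimal sketch of the naive expected-value comparison driving the mugging; every number in it is invented purely for illustration:

```python
# A minimal sketch of the naive expected-utility case for paying the mugger.
# All quantities here are hypothetical, chosen only to show the structure.
p_threat_real = 1e-50        # tiny credence that the mugger's threat is genuine
disutility_if_real = 1e60    # astronomically large harm if it is
cost_of_paying = 5           # small, certain cost of handing over the wallet

ev_refuse = -p_threat_real * disutility_if_real  # = -1e10
ev_pay = -cost_of_paying                         # = -5

# Naive expected-utility maximization says to pay: however small the
# probability, the mugger can always quote a disutility large enough
# to dominate the comparison.
print(ev_pay > ev_refuse)  # True
```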
In general—and this is something I seem to disagree with many in this community about—I think following your ethics or decision theory through to its honest conclusions tends to make more sense than assuming the status quo is probably close to optimal. There is of course some reflective equilibrium involved here; sometimes I do revise my understanding of the ethical/decision theory.
> This is similar to how you might dismiss this proof that 1+1=3 even if you cannot see the error.
To the extent that I assign nonzero probability to mathematically absurd statements (based on precedents like these), I don’t think there’s very high disutility in acting as if 1+1=2 in a world where it’s actually true that 1+1=3. But that could be a failure of my imagination.
> It is, however, a bit of a dissatisfying answer, as it is not very rigorous; it is unclear when a conclusion is so absurd as to require outright rejection.
This is basically my response. I think there’s some meaningful distinction between good applications of reductio ad absurdum and relatively hollow appeals to “common sense,” though, and the dismissal of Pascal’s mugging strikes me as more the latter.
> For example, you could worry about future weapons technology that could destroy the world and try to explore what this would look like – but you can safely say it is very unlikely to look like your explorations.
I’m not sure I follow how this helps. People who accept giving in to Pascal’s mugger don’t dispute that the very bad scenario in question is “very unlikely.”
> This might allow you to avoid Pascal’s mugger and invest an appropriate amount of time in more general, more flexible evil-wizard protection.
I think you might be onto something here, but I’d need the details fleshed out because I don’t quite understand the claim.
Thanks for your reply! :)