Thanks for the helpful summary. It’s worth pointing out that these arguments (which seem strong!) defend only fanaticism per se, not the stronger claim that is used or assumed when people argue for longtermism: that we ought to follow Expected Value Maximization. It’s a stronger ask in the sense that we’re asked to take bets not on arbitrarily high payoffs, which can be ‘gamed’ upward until the bet is worth taking, but ‘only’ on specific astronomically high payoffs, derived from (as it were) empirically determined information, facts about the universe that ultimately put an upper bound on the payoff.

That said, it’s helpful to have these arguments to show that ‘longtermism depends on being fanatical’ is not a knock-down argument against longtermism. Here’s one example of that link being made: “...the case for longtermism may depend either on plausible but non-obvious empirical claims or on a tolerance for Pascalian fanaticism” (Tarsney, 2019).
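To make the gap concrete, here’s a minimal formalization (the statement of fanaticism is the standard one from this literature; the notation and framing are my own gloss). Fanaticism says only that for every certain payoff $v > 0$ and every probability $p \in (0, 1]$, there exists some payoff $V$ such that a $p$-chance of $V$ (and nothing otherwise) is better than $v$ for certain. Expected Value Maximization is a complete ranking: for all lotteries $L$ and $L'$, $L \succeq L'$ iff $\mathbb{E}[L] \ge \mathbb{E}[L']$. EVM over an unbounded value scale entails fanaticism, since one can always pick $V$ with $pV > v$; but fanaticism does not entail EVM, because the existential quantifier lets the defender inflate $V$ as needed, whereas the longtermist bet comes with $V$ already fixed by facts about the universe.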