Thanks for this comment Michael—I think you make a great point about risks from malevolent actors. In terms of the longtermist economic growth aspect, I was thinking more along the lines of institutional quality in the 1600s explaining a lot of the more recent economic growth trajectories, with substantial consequences for global poverty.
Oh, yes, something I forgot to mention explicitly was that it sounded like you were talking primarily about timescales of centuries, which I don’t think is typically what longtermists are focused on. I think the typical view among longtermists is something like the following: “If things go well, humanity—or whatever we become—could last for such an incredibly long time that even a very small tweak to our trajectory, which lasts a substantial portion of that time, will ultimately matter a huge deal.* And it can matter much more than a larger ‘boost’ that would ultimately ‘wash out’ on the scale of years, decades, or centuries.”
This isn’t to say that economic growth isn’t important for longtermists, but rather that, if it is important to longtermists, that may be primarily because of its effects on other aspects of our trajectory, e.g. existential risk. (And it’s currently not totally clear whether growth is good or bad for x-risk, though I think the evidence leans somewhat towards it being good; see e.g. Existential Risk and Economic Growth. Other sources are linked from my crucial questions series.)
Though growth could also matter more “directly”, because a faster spread to the stars may reduce the ultimate astronomical waste. (There are also longtermists who may not care about astronomical waste, such as suffering-focused longtermists.)
In any case, the way you made the argument felt more “medium-termist” than “longtermist” to me. I mention that feeling partly because it may be useful information about how persuasive other longtermists would find that argument, and whether they’d feel it’s really a “longtermist” argument.
*If you want to be mathy, you can think of this as the area between two slightly different curves ultimately being very large, if we travel a far enough distance along the x-axis.
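The footnote’s point can be sketched numerically. Everything below is an illustrative toy, not anything from the comment: the “value trajectories” and the 0.1% tweak are made-up numbers, chosen only to show how a tiny persistent difference between two curves accumulates over a long enough horizon.

```python
import numpy as np

# Two hypothetical value trajectories: a flat baseline of 1 "unit of value"
# per year, and the same curve with a tiny persistent tweak (+0.1% forever).
baseline = lambda t: np.ones_like(t)
tweaked = lambda t: baseline(t) * 1.001

def area_between(horizon_years, n=100_000):
    """Approximate the area between the two curves out to a given horizon,
    using a simple Riemann sum over n sample points."""
    t = np.linspace(0.0, horizon_years, n)
    dt = t[1] - t[0]
    return float(np.sum(tweaked(t) - baseline(t)) * dt)

# Over a century, the difference is modest...
print(area_between(100))   # ≈ 0.1 extra units of value
# ...but over a cosmological timescale, the same tiny tweak adds up enormously.
print(area_between(1e9))   # ≈ 1,000,000 extra units of value
```

The point of the sketch is just that the gap scales linearly (here) with the horizon: a tweak that “washes out” in decades contributes a fixed amount, while a tweak that persists contributes in proportion to how far along the x-axis humanity travels.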