Does XR consider tech progress default-good or default-bad?
Leopold Aschenbrenner’s paper Existential risk and growth provides one interesting perspective on this question (note that while I find the paper informative, I don’t think it settles the question).
A key question the paper seeks to address is this:
Does faster economic growth accelerate the development of dangerous new technologies, thereby increasing the probability of an existential catastrophe?
The paper’s (preliminary) conclusion is:
we could be living in a unique “time of perils,” having developed technologies advanced enough to threaten our permanent destruction, but not having grown wealthy enough yet to be willing to spend sufficiently on safety. Accelerating growth during this “time of perils” initially increases risk, but improves the chances of humanity’s survival in the long run. Conversely, even short-term stagnation could substantially curtail the future of humanity.
Regarding your question:
Aschenbrenner’s model strikes me as a synthesis of the two intellectual programmes, and I don’t think it gets enough attention.