Does XR consider tech progress default-good or default-bad?
Leopold Aschenbrenner's paper Existential risk and growth provides one interesting perspective on this question (note that while I find the paper informative, I don't think it settles the question).
A key question the paper seeks to address is this:
Does faster economic growth accelerate the development of dangerous new technologies, thereby increasing the probability of an existential catastrophe?
The paper's (preliminary) conclusion is:
we could be living in a unique "time of perils," having developed technologies advanced enough to threaten our permanent destruction, but not having grown wealthy enough yet to be willing to spend sufficiently on safety. Accelerating growth during this "time of perils" initially increases risk, but improves the chances of humanity's survival in the long run. Conversely, even short-term stagnation could substantially curtail the future of humanity.
Aschenbrenner's model strikes me as a synthesis of the two intellectual programmes, and it doesn't get enough attention.