Hey Thomas! Love the feedback & follow-up from the conversation. Thanks for taking so much time to think this over; this is really well-researched. :)
In response to your arguments:
1 → 2 is generally well established in the climate literature. I think the quote you provided gives good reasons why climate war may not be perfectly rational; however, humans don't act in a perfectly rational way.
There are clear historical correlations between rainfall patterns and civil tensions, expert opinions linking climate to violent conflict, etc. I'd like to reemphasize that climate conflict is often not driven by resource scarcity dynamics alone, but also amplified by the irrational mentalities (e.g. "they've stolen from us," "they hate us," us vs. them) that have driven humanity to war for many decades before. There is a unique blend of rational and irrational calculations that plays into conflict risk.
2 → 3 → 4 is absolutely tenuous because our systems have rarely been stressed to this extent, so little to no historical precedent exists. However, this climate tension also interacts in non-linear ways with other elements of technological development; e.g. international AGI governance efforts may be significantly harder between politically extreme governments and in a context of rising social tension.
To address the “greatest risk” point for 3 → 4, I concede: my views have changed since I wrote this, as I've talked to more researchers in the AI alignment space.
From link-chain framing to systems thinking:
This specific 1 → 2 → 3 → 4 pathway directly causing existential risk may feel unlikely, and it is (alone). However, the emphasis I'd like to make is that there is a category of risks (usually politically related) that have the potential to cascade through systems in a dangerous, non-linear, volatile manner.
These systemic cascading risks are better visualized not as a linear link-chain where A affects B affects C affects D (this captures only one possible chain and none of the interwoven or cascading effects), but as a graph of interconnected socioeconomic systems, where one stresses a subset of nodes and studies how the stressor propagates through the system. How strong the butterfly effect is depends on the vulnerability and resilience of the system's institutions; thus, I aim to advocate for more resilient institutions to counter these risks.
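To make the graph framing concrete, here is a minimal sketch of stress propagating through a toy graph of institutions. All node names, resilience values, and the damping parameter are illustrative assumptions of mine, not taken from any real model:

```python
# Toy sketch: stress cascading through a graph of interconnected systems.
# Node names and all numbers below are purely illustrative assumptions.
from collections import deque

# Hypothetical adjacency: which systems pass stress on to which others.
edges = {
    "food_supply": ["migration", "markets"],
    "migration": ["politics"],
    "markets": ["politics"],
    "politics": ["agi_governance"],
    "agi_governance": [],
}

# Resilience: how much stress a node can absorb before it overflows.
resilience = {
    "food_supply": 0.2, "migration": 0.5, "markets": 0.6,
    "politics": 0.4, "agi_governance": 0.7,
}

def cascade(initial_stress: dict[str, float], damping: float = 0.8) -> dict[str, float]:
    """Propagate stress through the graph; a node passes on only the
    stress exceeding its resilience, attenuated by `damping`."""
    stress = {node: 0.0 for node in edges}
    queue = deque(initial_stress.items())
    while queue:
        node, amount = queue.popleft()
        stress[node] += amount
        overflow = stress[node] - resilience[node]
        if overflow > 1e-6:
            stress[node] = resilience[node]  # node saturates at capacity
            for neighbor in edges[node]:
                queue.append((neighbor, damping * overflow))
    return stress

# Stress one node (e.g. a climate shock to food supply) and watch it spread.
print(cascade({"food_supply": 1.0}))
```

In this toy setup, raising a node's resilience value shrinks the overflow it passes downstream and can stop a cascade entirely, which is the intuition behind advocating for more resilient institutions.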