Again, I agree with you that every civilisation has eventually collapsed. I also think it currently seems likely that our 'modern globalised' civilisation will collapse in turn, though I'm no expert on the matter.
I have no particular insight about how comparable the collapse of the Roman Empire is to the coming decades of human existence.
I agree that amidst all the existential threats to humankind, the content of this article is quite narrow.
Apologies, it’s really not my intent to hijack your thread. I do hope that others will engage you on the subject you’ve outlined in your article. I agree I should probably write my own article about my own interests.
I can't seem to help it. Every time I see an article about managing AI, genetic engineering, etc., I feel compelled to point out that attempting to manage such emerging technologies one by one is a game of whack-a-mole we are destined to lose. Not only is the scale of the powers involved in these technologies vast, but ever more of them, of ever greater power, will come online at an ever-accelerating pace.
What I hope to see more of are articles that address how we might learn to manage the machinery creating all these multiplying threats: the ever-accelerating knowledge explosion.
OK, I'll leave it there and get to work writing my own articles, which I hope you might challenge in turn.