Respectfully, I believe that facts and evidence support a doomsday scenario. For example, every human civilization ever created has eventually collapsed. Anyone proposing that this won’t happen to our current civilization as well bears a very heavy burden of proof.
The accelerating knowledge explosion we’re experiencing is built upon the assumption that human beings can manage any amount of knowledge and power delivered at any rate. This is an extremely ambitious claim, especially when we reflect on the thousands of massive hydrogen bombs we currently have aimed down our own throats.
To find optimism in this view we have to shift our focus to much longer time scales. What I see coming is similar to what happened to the Roman Empire. That empire collapsed from its own internal contradictions, a period of darkness followed, and then a new, more advanced civilization emerged from the ashes.
Apologies, I really have no complaint with your article, which seems very intelligently presented. But I will admit I believe your focus is too narrow. That is debatable of course, and counter-challenges are sincerely welcomed.
Again, I agree with you regarding the reality that every civilisation has eventually collapsed. I personally also agree that it doesn’t currently seem likely that our ‘modern globalised’ civilisation will escape collapse, though I’m no expert on the matter.
I have no particular insight about how comparable the collapse of the Roman Empire is to the coming decades of human existence.
I agree that amidst all the existential threats to humankind, the content of this article is quite narrow.
Apologies, it’s really not my intent to hijack your thread. I do hope that others will engage you on the subject you’ve outlined in your article. I agree I should probably write my own article about my own interests.
I can’t seem to help it. Every time I see an article about managing AI or genetic engineering etc. I feel compelled to point out that the attempt to manage such emerging technologies one by one is a game of whack-a-mole that we are destined to lose. Not only is the scale of powers involved in these technologies vast, but ever more of them, of ever greater power, will come online at an ever accelerating pace.
What I hope to see more of are articles that address how we might learn to manage the machinery creating all these multiplying threats: the ever accelerating knowledge explosion itself.
Ok, I’ll leave it there, and get to work writing my own articles, which I hope you might challenge in turn.
Thanks much for your engagement, Madhav.