Generally speaking, I would suggest shifting our focus away from the particular risks that arise from emerging technologies, and towards the machinery that is generating all such risks: an ever-accelerating knowledge explosion.
It’s natural to see a particular risk and wish to do something about it. But such a narrow focus isn’t fully rational once we realize that removing one particular existential risk doesn’t really matter unless we can remove them all. As an example, if I knew how to make genetic engineering fully safe, what would that matter if we then go on to have a nuclear war?
It’s a logical failure to assume, as seemingly almost all “experts” do, that we can continue to enthusiastically fuel an ever-accelerating knowledge explosion and then somehow successfully manage every existential risk that emerges from that process, every day, forever.
We’re failing to grasp what the concept of acceleration actually means. It means that if the knowledge explosion is going at, say, 50 mph today, tomorrow it will be going 75 mph, then 150 mph, then 300 mph, and so on. Sooner or later this accelerating accumulation of power will exceed the human ability to manage it. No one can predict exactly when or how we’ll crash the system, but simple common-sense logic says it will happen eventually on our current course.
The “experts” would have us focus on the details of particular emerging technological threats. The experts are wrong. What we need to focus on instead is the knowledge-explosion assembly line that is generating all of the threats.