My suggestion, as always, would be to shift some focus from particular technological threats, such as AI and genetic engineering, to the ever-accelerating knowledge explosion that is the source of these threats.
Imagine, if you will, that a perfect solution is found to the threats presented by AI and genetic engineering. That sounds great at first, but really, so what?
The ever-accelerating knowledge explosion will continue to generate ever more powers, of ever greater scale, at an ever faster pace. So long as the focus is on managing particular threats, rather than the mechanism generating all of them, we are playing a game of whack-a-mole that we will sooner or later lose.
Imho, the underlying problem uniting all these threats is not technical but philosophical. As a culture, we're clinging to a “more is better” relationship with knowledge that was entirely rational in the long era of knowledge scarcity, and blindly assuming that the same paradigm remains rational in an entirely new era characterized by knowledge exploding in every direction at an ever-accelerating rate.
We’re failing to adapt to the revolutionary new environment created by the success of the knowledge explosion. Continuing to push the knowledge explosion forward faster and faster, without limit, is not adapting to the future; it’s clinging to the past.
I’ll take a lot more interest in EA if I’m able to find anyone discussing the threats we face from this perspective.