Just because an event is theoretical doesn’t mean that it won’t occur. An asteroid hitting the Earth is theoretical right up until the moment of impact, at which point it becomes very real.
Some say that superintelligence has no precedent, but I think that overlooks a key fact. The rise of Homo sapiens radically altered the world—and all signs point to intelligence as the cause. Our current understanding is that intelligence is just a matter of information processing, and therefore our computers should, in principle, be capable of it someday, if only we figure out the right algorithms to implement.
If we learn that superintelligence is impossible, that would mean our best current scientific theories are wrong, and we would have learned something new: it would indicate that humans are somehow cosmically special, or at least sit at the ceiling of general intelligence. On the flip side, if we do create superintelligence, none of our current theories of how the world operates need to be wrong.
That’s why the prospect is important to take seriously: the best evidence we have available tells us that it’s possible, not that it’s impossible.