The result will be a singularity, understood as a fundamental discontinuity in human history beyond which our fate depends largely on how we interact with artificial agents.
The discontinuity is a result of humans no longer being the smartest agents in the world, and no longer being in control of our own fate. After this point, we’ve crossed an event horizon beyond which the outcome is almost entirely unforeseeable.
If you have accelerating growth that isn’t sustained for very long, you get something like human population growth from 1800 to 2000.
If, after surpassing humans, intelligence “grows” exponentially for another 200 years, do you not think we’ve passed an event horizon? I certainly do!
If not, using the metric of single-agent intelligence (i.e. not the sum of intelligence in a group of agents), what point during an exponential growth curve that intersects human-level intelligence would you define as crossing the event horizon?
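To make the question concrete, here is a minimal sketch of the kind of curve I mean. It models single-agent intelligence as a pure exponential and finds when it crosses an arbitrary "human level" threshold. The starting level, doubling time, and threshold are hypothetical placeholders, not claims about any real AI trajectory.

```python
import math

def years_to_cross(i0: float, human_level: float, doubling_years: float) -> float:
    """Years until I(t) = i0 * 2**(t / doubling_years) reaches human_level."""
    return doubling_years * math.log2(human_level / i0)

# e.g. starting at 1% of human level with a (hypothetical) 2-year doubling time:
t = years_to_cross(i0=0.01, human_level=1.0, doubling_years=2.0)
print(f"crosses human level after {t:.1f} years")  # ~13.3 years
```

The point of the sketch: the curve crosses any fixed threshold at a single, well-defined moment, so the question is which moment on that curve you'd call the event horizon.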