Shower thoughts: AI has advanced its intelligence so quickly by running thousands of training iterations. In a way, it has lived a thousand lifetimes within a single human lifespan. If each training run were one life, could that be analogous to one human life? And if AGI has a survival instinct, could that be analogous to our drive for the survival of the human race as a species? Does that change the way we should look at control or coexistence mechanisms with AGI?
Despite the enormous amount of learning AGI has accomplished, it hasn't produced anything resembling a replacement consciousness. I guess the current trajectory is useful for other areas of intelligence, but it isn't a replacement for our human cognitive capacity.
Oh, so apparently this is called the “Second Species” theory. I’ll need to read more on it.