Ah, yeah I misread your opinion of the likelihood that humans will ever create AGI. I believe it will happen eventually unless AI research stops due to some exogenous reason (civilizational collapse, a ban on development, etc.). Important assumptions I am making:
General intelligence is entirely computational, so it isn’t substrate-dependent
The more powerful an AI is, the more economically valuable it is to its creators
Moore’s Law will continue, so more computing power will be available.
If other approaches fail, we will be able to simulate brains with sufficient compute.
Fully simulated brains will be AGI.
I’m not saying that I think this would be the best, easiest, or only way to create AGI, just that if every other attempt fails, I don’t see what would prevent this one from working — particularly since we are already able to simulate portions of a mouse brain. I am also not claiming that this implies short timelines for AGI; I don’t have a good estimate of how long this approach would take.