We do trade with our microbiome. We feed it. It helps us digest.
Our microbiome communicates with us. It produces chemical signals that affect digestion and possibly our feeling of hunger. The extent of the microbiome's influence on the brain is not well known, but the pathways for that influence are. The gut microbiome has been shown to produce various chemicals and signaling molecules that influence the function of the digestive system and the immune system, including short-chain fatty acids that may cross the blood-brain barrier.
Perhaps the analogy here is better than the one with ants.
Or maybe both analogies are correct? Then the question is: how can we be like gut bacteria to the AI, and not like ants?
Or maybe analogies just add more confusion and we should go back to first principles xd
I mean, there’s an extremely narrow range of final goals for which flesh-and-blood humans are physically optimal infrastructure. Human arms can carry materials, human brains can solve problems, etc.; but if something is keeping us around just for that purpose, and not out of any concern for our welfare, then we’ll inevitably be phased out.
(And in reality, I don’t think AI will ever be at a capability level where it’s strong enough to take control, but not strong enough to benefit in expectation from phasing humans out.)
I think the right takeaway is very clearly “don’t build AGI that has no concern for human welfare”, not “try to be like gut bacteria (or talking ants) to a misaligned AGI”.
>extremely narrow range of final goals for which flesh-and-blood humans are physically optimal
Not so quick there. Currently AI can’t do anything without depending on humans. I have yet to hear an explanation of how the AI rids itself of this dependence.