60%: If humans stay biological, it's very hard for me to imagine ASI, with its vastly superior intelligence and processing speed, still taking direction from feeble humans in the long run. I think if we could get human brain emulations going before AGI got too powerful, perhaps by banning ASI until it is safe, then we have some chance. You can see why, for someone like me whose P(catastrophe|AGI) is much lower than their P(disempowerment|AGI), it's very important to know whether disempowerment counts as doom!