[Question] Am I taking crazy pills? Why aren’t EAs advocating for a pause on AI capabilities?
I feel like I’m taking crazy pills.
It appears that many EAs believe we shouldn't pause AI capabilities development until it can be proven to pose less than ~0.1% chance of X-risk.
Put less confusingly: many EAs appear to believe we should allow capabilities development to continue despite the current X-risks.
This seems obviously terrible to me.
What are the best reasons EA shouldn't be pushing for an indefinite pause on AI capabilities development?