This. Also most of the other AGI x-risk-related ideas on the FTX Future Fund project ideas comment thread. And just generally, allocate about 100-1000x more resources to the problem. Maybe that's too much for EA as it currently stands, but in an ideal world, more resources would be devoted to AI alignment than to AI capabilities.