And from the ronghosh article:
“… All we have to do is fix the loneliness crisis, the fertility crisis, the housing crisis, the obesity crisis, the opioid crisis, the meaning crisis, the meta crisis, the flawed incentives of the political system, the flawed incentives of social media, the flawed incentives of academia, the externalities leading to climate change, the soft wars with China and Iran, the hot war with Russia, income inequality, status inequality, racism, sexism, and every other form of bigotry.”
Of course, as someone who’s steeped in all the AI stuff, I can’t help but think that A) AI is the most important thing to consider here (ha!), since B) it might allow us (‘alignment’ permitting) to scale up the sort of sense-making and problem-solving cognition needed to tackle all the problems we seem to be increasingly making for ourselves. And yes, this is reductionist and probably naive.