I also think that whether or not the government regulates private AI has little to do with whether it militarizes AI. It’s not as if there is a single dial labeled “amount of government” that just gets turned up or down. The government can do very little to restrict what OpenAI/DeepMind/Anthropic do, while also spending heavily on military AI projects. So worries about militarization are not really a reason to oppose government restrictions on OpenAI/DeepMind/Anthropic.
Not to mention that insofar as the basic science here is being done for commercial reasons, any regulations that slow the commercial development of frontier models will also slow the progress of AI for military applications, whether or not that is what the US government intends, and regardless of whether those regulations are meant to reduce X-risk or to protect the jobs of cartoon voice actors facing AI replacement.
I think this is too pessimistic: why did a member of Biden’s cabinet bring in Christiano for one of the top positions at the US government’s AI safety organization, if the government will reliably prioritize the sort of factors you cite here to the exclusion of safety? https://www.nist.gov/news-events/news/2024/04/us-commerce-secretary-gina-raimondo-announces-expansion-us-ai-safety