Like you, I would prefer governments to take an increasing role, and hopefully even a dominant one.
I find it hard to imagine how this would happen. Over the last 50 years, I think (not with super high confidence) the movement in the Western world, at least, has been, through neoliberalism and other forces (in broad strokes), away from government control and towards private management and control. This includes areas such as:
Healthcare
Financial markets
Power generation and distribution
In addition to this, government ambition, both in terms of projects and new laws, has I think declined over the last 50 years. For example, things like the Manhattan Project, large public transport infrastructure projects, and power generation initiatives (nuclear, dams, etc.) have dried up rather than increased.
What makes you think that government will:
a) Choose to take control?
b) Be able to take control?
I think it's likely that there will be far more regulatory and taxation laws around AI in the next few years, but taking a “dominant role in the development of AI” is a whole different story. Wouldn't that mean something like launching whole ‘AI departments’ as part of the public service, and passing really ambitious laws to hamstring private players? Also, the markets right now seem to think this is unlikely, if AI company valuations are anything to go on.
I might have missed an article/articles discussing why people think the government might actually spend the money and political capital to do this.
Nice one.
I don’t find it hard to imagine how this would happen. I find Linch’s claim interesting and would find an elaboration useful. I don’t thereby imply that the claim is unlikely to be true.
Apologies, will fix that and remove your name. Was just trying to credit you with triggering the thought.
Thanks, no worries.