Ah, now I see. Thanks for clarifying. Yes, historically I do not know how much each setback to nuclear mattered. I can see that, e.g., constantly changing regulation during builds (which I think Isabelle actually mentioned) could pose a significant hurdle to continued build-out. Here I would defer to other experts like you and Isabelle.
Porting this over to "we might over-regulate AI too", I am realizing it is actually unclear to me whether people who use the "nuclear is over-regulated" example mean that the literal same historical sequence could happen to AI:
- We put in place constantly changing regulation on AI
- This causes large training runs to be stopped midway
- In the end, this makes AI uncompetitive with humans and AI never really takes off
- Removing regulation is not able to kick-start the industry, as talent has left and we no longer know how to build large language models cost-effectively
Writing this out, I still stand by my point that there are much better examples of regulation holding progress back (speeding up vaccine development is actually an EA cause area, e.g. human challenge trials). I can lay out the arguments for why this is so if helpful, but the core of it is something like "there is probably much more path dependency in nuclear than in AI or pharma".