What I’m reacting to is more the “hot take” version that shows up in EA-adjacent podcasts — often as an analogy when people talk about AI policy: “look at nuclear, it got over-regulated and basically died, so don’t do that to AI.” In that context it’s not argued carefully, it’s just used as a rhetorical example, and (to me) it’s a pretty lossy / misleading compression of what’s going on.
I agree it’s a bit lossy and sometimes reflexive (this is what I meant by relying on libertarian priors), but I am still confused about your argument.
The argument you criticize is a historical one (“nuclear over-regulation killed nuclear”), which is different from “now we need many steps and there are different strategies to make nuclear more competitive again”.
I think it is basically correct that over-regulation played a huge part in making nuclear uncompetitive, and I don’t think Isabelle or others who know the history of nuclear energy would disagree with that, even if the claim is a bit glossed over / stylized (obviously, it is not the only factor).
Ah, now I see, thanks for clarifying. Historically, I do not know how much each setback to nuclear mattered. I can see that constantly changing regulation, for example during builds (which I think Isabelle actually mentioned), could be a significant hurdle to continuing build-out. Here I would defer to experts like you and Isabelle.
Porting this over to “we might over-regulate AI too”, I am realizing it is actually unclear to me whether people who use the “nuclear is over-regulated” example mean that the literal same “historical” thing could happen to AI:
- We put in place constantly changing regulation on AI
- This causes large training runs to be stopped midway
- In the end this makes AI uncompetitive with humans and AI never really takes off
- Removing the regulation is not able to kick-start the industry, as talent has left and we no longer know how to build large language models cost-effectively
Writing this out, I still stand by my point that there are much better examples of regulation holding progress back (speeding up vaccine development is actually such an EA cause area, human challenge trials, etc.). I can lay out the arguments for why this is so if helpful, but it is basically something like “there is probably much more path dependency in nuclear compared to AI or pharma”.