Out of curiosity: Where have EAs argued that “nuclear is overregulated” and, more specifically, where have EAs argued that over-regulation is the only or dominant driver of the cost problem?
It’s probably true that this sometimes happens—especially when EAs outside of climate/energy point to “nuclear is overregulated” as something in line with libertarian / abundance-y priors—but I think those in EA who have done work on nuclear would not subscribe to or spread the view that regulation is the only driver of nuclear’s problems.
That said, it seems clearly true—and I do think Isabelle agrees with this—that regulatory reform is a necessary component of making nuclear in the West buildable at scale again (alongside many other factors, such as sustained political will, technological progress, re-established supply chains, valuing clean firm power for its attributes, etc.).
Good question. I agree: people in EA who’ve actually worked on nuclear don’t usually claim over-regulation is the only or even dominant driver of the cost/buildout problem.
What I’m reacting to is more the “hot take” version that shows up in EA-adjacent podcasts — often as an analogy when people talk about AI policy: “look at nuclear, it got over-regulated and basically died, so don’t do that to AI.” In that context it’s not argued carefully, it’s just used as a rhetorical example, and (to me) it’s a pretty lossy / misleading compression of what’s going on.
So I’m not trying to call out serious nuclear work in EA — I’m mostly sharing the Clearer Thinking episode as a good “orientation reset” because it keeps pointing back to what the binding constraints plausibly are, with regulation as one (maybe not even the main) piece of a complex situation.
Also possible I’m misremembering some of the specific instances — I haven’t kept notes — but I’ve heard the framing enough that it started to rub me the wrong way.
And I’m genuinely curious where you land on the “regulatory reform is necessary” point: do you think the key thing is removing regulation, changing it, or adding policy/market design (e.g. electricity market reform / stable revenue mechanisms / valuing clean firm power)? I’m currently leaning toward “markets/revenue model is the real lever”, but I’m not confident.
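By “stable revenue mechanisms” I mainly have in mind things like contracts-for-difference (CfDs). Here is a minimal sketch, with purely illustrative numbers, of why that matters for a capital-heavy plant: a two-way CfD pins revenue per MWh to a strike price, stripping out the spot-price volatility a merchant plant is exposed to.

```python
# Toy sketch (all numbers hypothetical) of merchant vs. contract-for-difference
# (CfD) revenue for a generator selling a fixed output each hour.

wholesale_prices = [15, 80, 5, 120, 60, 0, 95, 40]  # EUR/MWh, illustrative hours
strike_price = 65.0   # EUR/MWh, assumed CfD strike
output_mwh = 1000.0   # MWh sold each hour (assumed constant)

merchant = [p * output_mwh for p in wholesale_prices]
# Two-way CfD: the generator is topped up when the spot price is below the
# strike and pays the difference back when it is above, so effective revenue
# per MWh equals the strike no matter what the spot market does.
cfd = [(p + (strike_price - p)) * output_mwh for p in wholesale_prices]

print("merchant revenue by hour:", merchant)
print("CfD revenue by hour:     ", cfd)  # constant: strike_price * output_mwh
```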
One thing I loved reading was a model of Sweden’s total system cost with vs. without nuclear (including things like transmission build-out). It suggested fairly similar overall cost in both worlds — but the nuclear-heavy system leaned more on established tech (fewer batteries, etc.; I don’t remember whether demand response was included).
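To make the “similar total system cost” point concrete, here is a minimal toy comparison. All numbers are invented for illustration (they are not from the Swedish study); the point is only that a portfolio with cheaper generation can end up at a similar total once storage and transmission are counted.

```python
# Toy system-cost comparison (all inputs hypothetical): a nuclear-heavy vs. a
# renewables-heavy portfolio, where the renewables case needs more storage
# and more transmission build-out.

def total_system_cost(components):
    """Sum annualised costs (billions per year) over system components."""
    return sum(components.values())

nuclear_heavy = {
    "generation": 8.0,     # nuclear plus some wind; pricier per MWh
    "storage": 0.5,        # little battery capacity needed
    "transmission": 1.0,   # plants can be sited near demand
}
renewables_heavy = {
    "generation": 6.0,     # cheaper per-MWh wind/solar
    "storage": 2.0,        # batteries / flexibility to cover firmness
    "transmission": 1.8,   # grid build-out to reach remote resources
}

for name, system in [("nuclear-heavy", nuclear_heavy),
                     ("renewables-heavy", renewables_heavy)]:
    print(f"{name}: {total_system_cost(system):.1f} bn/yr")
# Both land in the same ballpark despite very different technology mixes.
```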
My read is that the real challenge is: even if total system costs are comparable, how do you actually allocate those costs and rewards in something resembling a market so the “good” system gets built? (Unless you go much further toward a state-owned, heavily regulated model and basically nationalise the whole thing.)
> What I’m reacting to is more the “hot take” version that shows up in EA-adjacent podcasts — often as an analogy when people talk about AI policy: “look at nuclear, it got over-regulated and basically died, so don’t do that to AI.” In that context it’s not argued carefully, it’s just used as a rhetorical example, and (to me) it’s a pretty lossy / misleading compression of what’s going on.
I agree it’s a bit lossy and sometimes reflexive (this is what I meant by relying on libertarian priors), but I am still confused about your argument.
The argument you criticize is a historical one (“over-regulation killed nuclear”), which is different from “now we need many steps, and there are different strategies, to make nuclear competitive again”.
I think it is basically correct that over-regulation played a huge part in making nuclear uncompetitive, and I don’t think Isabelle or others who know the history of nuclear energy would disagree with that, even if it is a bit overglossed / stylized (obviously, it is not the only factor).
Ah, now I see—thanks for clarifying. Historically, I do not know how much each setback to nuclear mattered. I can see that, e.g., constantly changing regulation during builds (which I think Isabelle actually mentioned) could be a significant hurdle to continued build-out. Here I would defer to experts like you and Isabelle.
Porting this over to “we might over-regulate AI too”, I am realizing it is actually unclear to me whether people who use the “nuclear is over-regulated” example mean that literally the same “historical” thing could happen to AI:
- We put in place constantly changing regulation on AI
- This causes large training runs to be stopped midway
- In the end this makes AI uncompetitive with humans, and AI never really takes off
- Removing regulation is then not able to kick-start the industry, as talent has left and we no longer know how to build large language models cost-effectively
Writing this, I still stand by my point that there are much better examples of regulation holding progress back (speeding up vaccine development is actually an EA cause area, human challenge trials, etc.). I can lay out the arguments for why if helpful, but it is basically something like “there is probably much more path dependency in nuclear than in AI or pharma”.