Hi Jason, thanks for writing this article. I appreciate your effort in digging into the topic beyond a surface level. I agree with you that the change from AEC to NRC/ERDA was mostly irrelevant to the level of regulation. And I love your way of explaining ALARA. Your analysis of cost drivers is also pretty convincing. Unfortunately, while your thesis is partially correct, some of your arguments are factually shaky. Please allow me to offer a few corrections and criticisms, in the spirit of trying to better understand the problems and possible solutions.
First a couple of minor things, not really important to the main argument:
- "The Automatic Depressurization System will open automatically in the event that the pressure in the reactor vessel is too high, and vent some gas outside."
The ADS isn't for pressure relief, it's for rapid emergency depressurization to near-atmospheric pressure so that you can inject cooling water with low-pressure pumps.
- "The Reactor Protection System is a piece of control software that will continuously monitor measurements and then automatically order the shutdown of the reactor if necessary."
The RPS is very much not a piece of control software, at least not at current power plants. It's a complicated physical system with arrays of pneumatic tubes and electrical buses and big mechanical relays and circuit breakers that will get stuck if you don't grease them properly. Relevant story: https://blog.ucsusa.org/dlochbaum/fission-stories-106-lightning-strikes-twice
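If it helps to picture what all that hardware is actually doing, here is the usual 2-out-of-4 coincidence voting idea expressed as code. This is purely a conceptual sketch on my part: the real thing is relays and breakers rather than software, and the parameter names and setpoints below are made up.

```python
# Illustrative sketch only: the 2-out-of-4 coincidence voting that RPS relay
# trains typically implement, written in Python for clarity. Real plants do
# this with hardwired relays and circuit breakers, not software; the
# parameter names and setpoints here are hypothetical.

TRIP_SETPOINTS = {
    "neutron_flux_pct": 118.0,                # hypothetical high-flux trip setpoint
    "pressurizer_pressure_psig": 2385.0,      # hypothetical high-pressure trip setpoint
}

def channel_votes(readings: dict, parameter: str) -> list[bool]:
    """Return one trip/no-trip vote per redundant instrument channel."""
    setpoint = TRIP_SETPOINTS[parameter]
    return [value > setpoint for value in readings[parameter]]

def reactor_trip(readings: dict) -> bool:
    """Trip the reactor if any parameter has at least 2 of 4 channels voting to trip."""
    for parameter in TRIP_SETPOINTS:
        if sum(channel_votes(readings, parameter)) >= 2:
            return True
    return False

# One failed-high channel alone does not trip the plant; two concurring channels do.
print(reactor_trip({"neutron_flux_pct": [101, 119, 102, 100],
                    "pressurizer_pressure_psig": [2235, 2240, 2238, 2236]}))  # False
print(reactor_trip({"neutron_flux_pct": [119, 120, 102, 100],
                    "pressurizer_pressure_psig": [2235, 2240, 2238, 2236]}))  # True
```

The coincidence logic is the point: it lets a single stuck or failed channel neither spuriously trip the plant nor block a real trip.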
More important is this paragraph:
- "Probabilistic Risk Assessment is the heart and soul of the NRC’s, and in fact the entire World’s, approach to regulating nuclear power plant safety. It controls all of the key decisions that really matter, and crucially, all of the decisions about what sort of expensive safety features are necessary."
I regret to inform you that this is wishful (okay let's say "aspirational") thinking. PRA is used now more than it used to be, and more in the US than most other places, but it's still basically treated as icing on the safety cake. You design a reactor to follow all the old prescriptive rules, and then you do a PRA, and if it turns out well then it might help you argue for an exemption to some specific rule. It's used in many cases to figure out how long you can take to fix a broken piece of safety equipment - but there's still a 30 day maximum, no matter what your risk calculation says.
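To make that completion-time point concrete, here is a back-of-the-envelope sketch of how a risk-informed calculation like that works. The 1e-6 incremental core damage probability criterion and the plant numbers are illustrative assumptions on my part; the only thing taken from my comment above is the 30-day hard backstop.

```python
# Rough sketch of a risk-informed "completion time" for a broken piece of
# safety equipment, assuming (illustratively) that the acceptance criterion
# is keeping the incremental core damage probability below 1e-6. All numbers
# are made up except the 30-day backstop mentioned above.

HOURS_PER_YEAR = 8760.0
ICDP_LIMIT = 1.0e-6          # assumed incremental core damage probability cap
BACKSTOP_HOURS = 30 * 24.0   # 30-day maximum regardless of the PRA result

def allowed_outage_hours(baseline_cdf: float, cdf_with_equipment_out: float) -> float:
    """Hours the equipment may stay out of service before the risk cap is reached."""
    delta_cdf_per_hour = (cdf_with_equipment_out - baseline_cdf) / HOURS_PER_YEAR
    if delta_cdf_per_hour <= 0:
        return BACKSTOP_HOURS
    return min(ICDP_LIMIT / delta_cdf_per_hour, BACKSTOP_HOURS)

# Hypothetical plant: baseline core damage frequency 2e-5/yr, rising to
# 8e-5/yr with one emergency diesel generator out of service.
print(allowed_outage_hours(2e-5, 8e-5))  # ~146 hours (~6 days), well under the cap
```

Even in this risk-informed version, the PRA result only gets to move the answer around underneath a prescriptive ceiling.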
The Regulatory Analysis Guidelines you pointed to are for cost-benefit analysis, which probably should be done for all NRC rules (most federal agencies are required to do this). But the NRC only has to do it for "backfit" rules that require changes to already-completed power plants. The rest of the time, it's optional.
Maybe these limitations will change over time. The idea that the NRC should be "risk-informed" was only coined in the late 1990s, though the actual process started a bit earlier. Progress has been bumpy. Part 53, the NRC's proposed set of rules for licensing advanced reactors, is a good example. It was originally a framework completely based on PRA. But some in the nuclear industry complained, because doing a PRA is pretty expensive, especially if you're just building a little micro-reactor. So then the NRC added a second option that uses prescriptive rules kind of like the old ones, but applicable to lots of types of reactors instead of just light water reactors. A lot of nuclear industry people don't like that one either, and now the whole rule is 1,300 pages long. (Fingers crossed it will improve before it's finalized.)

But even if every regulation on nuclear power is someday based on rigorous cost-benefit analysis, that isn't the same as rationally pricing externalities. The marginal cost of increasing safety/decreasing pollution enough to prevent a death could be exactly the same for a coal plant and a nuclear plant, but the coal plant remains cheaper in most cases, largely because the pollution it is not paying for is so much more harmful. Unless we can get our act together and just charge for polluting, changes to NRC regulations are unlikely to fix the problem. A carbon tax is the obvious way, if politically unappealing. But taxing other types of air pollution would do a lot to help get rid of coal.
In other words, even if we are doing cost-benefit analysis to optimize trade-offs for each type of power plant, we are not optimizing the societal cost overall. Proper application of cost-benefit to a coal plant would show that you could dramatically reduce health effects at a modest increase in cost… by replacing it with something different. As far as I know, this is never required and there is no proposal to require it.
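To put toy numbers on that argument (all of them hypothetical, chosen only to show the shape of it): both plants can face the same marginal cost per death averted under their own cost-benefit rules, and coal still wins on price until its unpriced harm is actually charged for.

```python
# Entirely hypothetical numbers, just to illustrate the argument above.

VALUE_PER_DEATH = 10e6        # $ per statistical death, illustrative

plants = {
    #            private $/MWh   deaths per TWh remaining after cost-justified controls (assumed)
    "coal":    {"lcoe": 60.0, "residual_deaths_per_twh": 10.0},
    "nuclear": {"lcoe": 90.0, "residual_deaths_per_twh": 0.03},
}

for name, p in plants.items():
    # 1 TWh = 1e6 MWh, so harm in $/TWh divided by 1e6 gives $/MWh
    external_cost = p["residual_deaths_per_twh"] * VALUE_PER_DEATH / 1e6
    print(f"{name:8s} private {p['lcoe']:5.1f} $/MWh, "
          f"unpriced harm {external_cost:5.1f} $/MWh, "
          f"total {p['lcoe'] + external_cost:6.1f} $/MWh")

# coal:    private 60 $/MWh + ~100 $/MWh of unpriced harm -> ~160 $/MWh all-in
# nuclear: private 90 $/MWh + ~0.3 $/MWh of unpriced harm -> ~ 90 $/MWh all-in
```

With these made-up inputs the market still picks coal, because only the private column shows up on anyone's bill.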
Some people have suggested that to even the playing field, we should loosen nuclear regulations until it is just as harmful as coal. Aside from the practical problems with this, you have elegantly explained why it would be fundamentally dumb.
- "I’d also like to reiterate that the amount of money spent on controlling low-level radiation is a very very small fraction of the cost of producing nuclear power. The normal, non-emergency operation of nuclear reactors pose a real but small threat to employees and the public. Yes, the government has a regulatory framework that says that that threat has to be addressed within a rubric of cost-benefit analysis. The amount of money spent on that threat is small for precisely the reason that the threat is small to begin with, and it’s cheap to contain."
Low-level radiation comes up in two different ways. One is normal operation, as you mention, where most exposure is to the employees. The other is after a severe accident spreads radioactive material over a wide area. In most such accidents, large doses can be easily avoided by evacuating. But for people to return to their homes a few weeks later, or continue farming the land, etc., our society has to be willing to tolerate some small ongoing radiation dose that might (or might not) increase their risk of cancer. In theory this is just another application of ALARA, but I think people kind of confuse it with LNT.
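For a sense of scale, here is the back-of-envelope arithmetic using the commonly cited LNT coefficient of roughly 5% lifetime cancer risk per sievert. Whether LNT even applies at doses this low is exactly the contested point, so treat this as illustrative only.

```python
# Back-of-envelope LNT arithmetic for a post-accident "return home" dose.
# The ~5% per sievert lifetime cancer-risk coefficient is the commonly cited
# linear-no-threshold figure; whether LNT holds at these dose rates is the
# disputed assumption, so this is only illustrative.

RISK_PER_SV = 0.05            # lifetime excess cancer risk per sievert under LNT

def lnt_excess_risk(annual_dose_msv: float, years: float) -> float:
    """Lifetime excess cancer risk from a chronic dose, assuming strict LNT."""
    total_dose_sv = annual_dose_msv / 1000.0 * years
    return total_dose_sv * RISK_PER_SV

# Living 30 years in an area with an extra 1 mSv/yr above background:
print(f"{lnt_excess_risk(1.0, 30):.4%}")   # 0.1500% lifetime excess risk
# For comparison, baseline lifetime cancer incidence is on the order of 40%.
```

The policy question is whether society treats that kind of increment as tolerable in exchange for letting people go home, and that is an ALARA-style trade-off rather than anything LNT itself decides.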
- "The NRC doesn’t demand any design changes during construction."
This is sort of true, now, but it wasn't always. Vogtle 3 and 4 were licensed under Part 52, which means they got a combined Construction and Operating License before they started construction. Their design was supposed to be finalized at that point. When they ended up making changes, it added to the cost, partly because they had to get their license amended for many of those changes and that caused delays. Also, some other things had to be redone because they didn't meet the very strict quality standards required by the design they licensed. I think it's fair to blame that stuff on the builders, not the NRC, but it's still ultimately a cost of regulation.
But Part 52 was introduced in 1989. Most nuclear plants in the US were built under the requirements of Part 50, where they would get a construction permit based on a preliminary design, and then once it was finished they'd go back for an operating license. That meant no expensive amendments during the construction process. But it also meant that in order to get the operating license, they had to comply with rule changes made while the reactor was under construction. In the 1970s, when many of those reactors were under construction, there were lots of rule changes, some related to TMI and some not. You see the problem.
These days Part 50 is still an option - the NRC just recently issued a construction permit under Part 50 for the Kairos Hermes test reactor. It's a little safer now because the rules have mostly stabilized, and I think a lot of companies building first-of-a-kind reactors actually should be more worried about the problems of Vogtle 3 than the problems of reactors in the late 1970s, so applying for a construction permit instead of a combined license could make sense.
You mentioned backfits, but it's worth noting that they don't always have to be justified with a cost-benefit analysis (there's an exception if the change is required for "adequate protection" of public health and safety, whatever that means). And they certainly didn't stop shortly after TMI. Substantial backfits were required in the 1980s (the ATWS rule), after 9/11, and probably the most costly set was after Fukushima, when all plants were required to reevaluate seismic and flood risks and a bunch of BWRs were required to install hardened vent systems.
I think the appropriate conclusion is neither "nuclear only failed because of regulation" nor "the problems are unrelated to regulation". There's a big difference between how competitive nuclear power would be tomorrow if we relaxed regulation today, and how competitive it would be today if those regulations had been relaxed from 1960 on. The latter could be a huge difference, but the specifics are very difficult to guess. The former is just probably not very dramatic. The truth is that any change in regulations now will take many years, if not many decades, to filter through to its maximum effect on the US energy landscape.
Hey man! Thanks so much for responding, and sorry for the late reply.
You make a whole bunch of good points, and many of your corrections are probably accurate.
But in particular, I'd be interested in this: what's the biggest single example of a change that had to take place during construction back when they had separate construction/operating permits? My strong suspicion is that although there were some changes like that, they were orders of magnitude smaller than anything that could plausibly be blamed for the sorts of cost overruns those construction projects saw. But I'd absolutely be interested to know what the biggest example of a change forced on a plant mid-construction actually was.
But here's the big picture: I'm not sure what we actually disagree on! In 1980, I can really only imagine two policies that would have saved the nuclear industry. First, you could have put a huge punitive tax on carbon-burning power. Second, you could have had a huge public investment campaign - essentially taking the TVA national.
Now, I would have been in favor of either of those two things, but I simply cannot imagine either of them plausibly being passed at the time. I mean, Ronald Reagan wanted to sell the TVA. I just can't plausibly see him initiating what would have been one of the largest public investment campaigns in American history. And neither can I imagine the politics of the era supporting a huge carbon tax. And those are really the only two policies that I can imagine saving the nuclear industry.
The specific claim that's been made by many people is that nuclear power failed *because of NRC overregulation.* And I'm here to say that's just not true. You could absolutely say that nuclear power failed because of bad policy, but it's all sins of omission, not commission. That's the point.
I am really quite skeptical that you could ever make nuclear power competitive by reducing the value we assign to the social cost of pollution. Even if we drastically slashed that value from $10M down to something like $1M, my general sense is that it just wouldn't be enough to overcome the economic body blows the nuclear industry was getting from all different directions. I'd have to see it to believe it.
So that's really my position. Nuclear power absolutely could have been saved by a large public investment campaign or a carbon tax - that's what it would have taken to save it in the 1970s - but the specific claim being made is that the problem was NRC overregulation. And these days, now that we have cheaper solar and wind anyway... I actually really just don't see the point.
Again, thanks so much for your reply!
Jason, you asked an excellent question here, and I spent quite some time looking for a good answer. I eventually found this book from 1981: https://www.komanoff.net/nuclear_power/Power_Plant_Cost_Escalation.pdf
It has an excellent discussion of many expensive changes in regulation, but leads off with the comment that the costs are usually not quantifiable. But as I was reading, I realized there was actually one big change I could put a number on. Shoreham Nuclear Power Plant began construction in 1972. Like many other plants, it experienced delays and cost overruns for various reasons, including public opposition. By the time it was completed in 1984, the NRC had created new post-TMI requirements for evacuation planning in coordination with the state. This allowed the governor of New York to simply refuse to coordinate in any way. The plant was never operated. Therefore, the cost of the rule change was, in some sense, the entire construction cost of several billion dollars.
Specific examples aside, I do think we mostly agree. The economic conditions probably had more effect than the regulations, and by 1980 it certainly would have been too late to save the nuclear momentum without something dramatic. If better regulations had been in place from the start, and stable, then maybe all those over-budget plants would have been faster and cheaper and a lot more would have been finished before the economy soured. Maybe TMI never would have happened. Maybe the public would have been more willing to believe nuclear was safe. And maybe none of that would have been enough to make a difference.
And I know I already wrote half a book there, but two more things:
- "My challenge to the NRC skeptics is simply to tell me what, exactly, the NRC is requiring that they shouldn’t be requiring."
I don't have a solid answer to this, but one pet peeve I can point to is the Quality Assurance requirements in 10 CFR Part 50 Appendix B. This appendix is written in broad/vague language, and everything in it honestly sounds pretty reasonable. But the way it is implemented (which is endorsed, and effectively required, by the NRC) is via the industry standard NQA-1, which is often pointed out as a major source of costs to manufacture nuclear-grade components. This is the source of the (perhaps apocryphal) quote that "when the paperwork weighs as much as the reactor vessel, it's ready to ship." Every step of manufacturing and testing must be documented in great detail. As far as I know, there has never been a credible attempt to estimate what effect these QA requirements have on risk. I'm sure it's true that an NQA-1 pipe is less likely to spontaneously rupture than a commercial grade pipe. But how much less? Is it worth it? Does anyone know? And if not, why is it required? Maybe this analysis exists somewhere, but I've looked around and haven't found it.
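To illustrate the kind of analysis I mean - with every number invented purely for the sake of the sketch - the question would look something like this:

```python
# The sort of analysis the paragraph above says seems to be missing, with
# every input invented purely for illustration: what does an NQA-1 cost
# premium imply per statistical death averted, even if the nuclear-grade
# pipe eliminated the rupture risk entirely?

VALUE_PER_DEATH = 10e6                 # $ per statistical death averted, illustrative
EXPECTED_DEATHS_PER_RUPTURE = 0.01     # assumed: most ruptures are safely mitigated
PLANT_LIFETIME_YEARS = 60

def dollars_per_death_averted(cost_premium: float,
                              rupture_freq_reduction_per_year: float) -> float:
    """Implied cost per statistical death averted by the nuclear-grade upgrade."""
    deaths_averted = (rupture_freq_reduction_per_year
                      * EXPECTED_DEATHS_PER_RUPTURE
                      * PLANT_LIFETIME_YEARS)
    return cost_premium / deaths_averted

# A $1M premium on a piping run, assumed to eliminate a 1e-3/yr
# commercial-grade rupture frequency over the plant's life:
print(f"${dollars_per_death_averted(1e6, 1e-3):,.0f} per death averted")
# -> $1,666,666,667 with these made-up inputs.
```

I'm not claiming those are the right inputs - that's the whole point. Someone with defensible failure-rate data should have run this calculation somewhere, and I can't find it.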
- "If he really believes that, that means that the NRC refused to approve of the NuScale application until they added all sorts of silly expensive safety features that only a fool would think are necessary, right? Can he tell me what they are?"
The NRC nitpicked a lot of things in that design; I wouldn't say "only a fool" would think they're necessary, but you don't have to be a fool to be wrong or just overly cautious. One that I heard about in particular was the boron injection system. This is designed to serve as a backup shutdown mechanism (boron is a potent neutron absorber) in case the control rods are not sufficient to stop the nuclear reaction, for instance if not all of them go in. The boron is dissolved in water, and you inject the water into the reactor to shut it down. Pretty standard stuff.

Now, during a prolonged loss of coolant accident you could gradually boil off a lot of that water. The steam rises to the top, where it condenses, and then it falls to the bottom of the containment vessel. Once enough of it builds up, it can flow back into the vessel to replenish the cooling water and ensure the reactor stays covered. The problem is, that condensed steam has essentially been distilled--it doesn't bring the boron with it. As long as it goes in slowly, it should mix with the borated water already in the vessel, and it's fine. But the NRC theorized that if something caused a big slug of cold unborated water to come in all at once (I forget how this was supposed to happen), it could displace the borated water already in there rather than mixing. Then the reactor would be covered in cold water, with no boron.

Most of the time you'd still be fine, because built-up xenon-135 will still act as a neutron poison; but most of that xenon decays away after about 24-48 hours. So if your slug of clean water comes in at the 48 hour mark, and the reactor has been running for a couple years already so it was about to need refueling, then calculations showed that it might just be sufficient to cause the reactor to start up again. Of course at that point, the water would heat up and the rise in temperature would make it shut down, and probably pretty quickly the boron would mix back in. But still, temporarily there would be a possibility of a recriticality, which is bad. This got argued about back and forth for months, and the NRC staff's safety review of the design certification application mentioned it as a caveat to the general conclusion that NuScale would meet all safety requirements. Eventually, the commission approved the design cert despite this and a couple of other caveats:
https://www.federalregister.gov/documents/2023/01/19/2023-00729/nuscale-small-modular-reactor-design-certification#h-28
In the next version (not yet approved), NuScale added a "basket" in the containment vessel that would contain blocks of slow-dissolving boron to eliminate the problem. I don't know how much cost this adds, or how much the arguments delayed licensing. In NuScale's case it wouldn't be the straw that breaks the camel's back; their design was already kind of a money pit. But I do think it's a case of regulatory burden with little to no safety benefit. Fifty years ago, when the industry hit tough macroeconomic conditions, was this kind of thing (plus NEPA and so forth) enough to push them over the edge and strangle the expansion of nuclear power? I don't know for sure, but it doesn't seem too farfetched.
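One small piece of this that is easy to put numbers on is the xenon timing. Xe-135 has a half-life of about 9.1 hours (after shutdown it actually peaks first, roughly ten hours in, as iodine-135 decays into it, and then falls off), so a quick decay calculation shows why ~48 hours is the window that matters:

```python
# Quick check on the xenon timing mentioned above: how fast the Xe-135
# poison disappears once it starts decaying. Ignores the initial post-
# shutdown buildup from iodine-135 decay, which only delays the peak by
# about ten hours.
import math

XE135_HALF_LIFE_H = 9.14   # hours

def xenon_fraction_remaining(hours: float) -> float:
    """Fraction of Xe-135 left after `hours` of pure radioactive decay."""
    return math.exp(-math.log(2) * hours / XE135_HALF_LIFE_H)

for t in (12, 24, 48):
    print(f"after {t:2d} h: {xenon_fraction_remaining(t):.1%} of the xenon remains")
# after 12 h: 40.3%, after 24 h: 16.2%, after 48 h: 2.6% -- which is why a
# slug of unborated water arriving around the 48-hour mark is the worrying case.
```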
Great article, I have some questions we can discuss next time we talk.
This is an amazing article, Jason. Despite the complicated topic (with all sorts of statistics and physics), you wrote in a way that makes it easy to understand and a joy to read. I learned an incredible amount about a very interesting subject. Thanks for the effort it took to do this, it was well worth it.
One question: As around 80% of electricity in France is from nuclear power, what's your prediction for their future in generating electricity?