I have just spent the past hour reading through the technical specifications for the post-Fukushima European stress tests, designed to assess the potential impact of extreme events challenging plant safety functions. The Western European Nuclear Regulators’ Association, WENRA, which seems to be the main author, has done a thorough job.
The idea of a stress test is clever, since it targets what some people might see—particularly since Fukushima—as a blind spot in nuclear safety. The safety risks of many normal incidents will already be covered in the plant’s safety case. But a plant’s response to extreme events will not have been studied in as much detail as the more likely, so-called ‘design basis’ scenarios. So the European stress test scenarios should avoid duplicating previous work.
The time and effort involved in developing, carrying out and reviewing stress tests can be justified by the nuclear industry’s learning culture, which generates lessons learned wherever and however incidents occur. And it is perfectly natural for the event to have shaken the general public’s faith in nuclear power. It is hard to insist on the unlikeliness of an extremely unlikely event when one has just happened.
The public’s concern may be perfectly understandable; but it doesn’t make the risk any more real. The temptation to think it does is the fallacy known as the ‘availability heuristic’: we judge an event’s likelihood by how easily and vividly it comes to mind, independent of its actual probability. For example, I’m frightened to walk out into my dark garden after watching a scary movie. In my head I know that the chance of finding an axe-wielding maniac hiding in the hydrangeas is about the same as it was two hours before. But I’m still nervous when I open the door.
Why have regulatory groups around the world spent so much time on the circumstances that led to the Fukushima disaster? The technical imperative is to make sure that Fukushima has not revealed some kind of nuclear industry blind spot. The political imperative is to be seen to be doing something to respond to the events, to counteract the distressing news coverage of an INES level 7 disaster, and so rehabilitate the image of civil nuclear power.
All of this work, the studies, conferences, publications, recommendations and rulings, will probably improve nuclear safety. But I doubt it is the most cost-effective use of our industry’s time and energy.
Can I just use this opportunity to advance what may seem, post-Fukushima, a radical notion? Beyond-design-basis events are, by definition, less likely to occur than design-basis ones. So I think what we should really examine carefully are the near misses: cases where the safety systems of an operating reactor actually broke down.
Take, for example, an October 2010 valve failure that did not lead to an accident. A low-pressure coolant injection valve failed to open at a US BWR 4 reactor of similar vintage to Fukushima Daiichi when operators attempted to use the residual heat removal shutdown cooling loop during refuelling. The failure of the valve could have threatened the safety of the reactor in an emergency scenario. To be honest, I only learned about the incident, at Tennessee Valley Authority’s Browns Ferry 1, because the US Nuclear Regulatory Commission flagged it as an event of high safety significance. Although the valve was repaired before the unit went back into service, and the plant continued to operate safely, “significant problems involving key safety systems warrant more extensive NRC inspection and oversight,” said NRC regional administrator Victor McCree in May.
Despite the NRC’s laudable efforts, I doubt that faults like this one will be taken as seriously as all the dirt dug up on Fukushima Daiichi’s utility Tokyo Electric Power Company during the course of post-event investigations. If so, that is a pity. There may have been operational and equipment problems at Daiichi. But to a significant degree all of these concerns are beside the point, since the meltdown would not have occurred without the huge, and hugely improbable, tsunami.
Unfortunately, it seems to be human nature to find what did happen much more compelling than what might have happened.
Will Dalrymple, editor