Wednesday, June 02, 2010

WHY I HAVE HOPES FOR “DESIGN FOR PREVENTION”

James R. Fisher, Jr., Ph.D.
© June 2, 2010

* * *

AN ENGINEER WRITES

Some years ago, I wrote a one-page screed to Bill (author William L. Livingston) entitled “Mihama Beach,” with a subtitle about the uniform toxic posture of management the world over. Its central point is management’s readiness to ignore technical requirements outright. The minor event I witnessed in Japan at Mihama Nuclear Generating Station Unit #2 (in 1972) presaged Three Mile Island and Chernobyl.

Norman

* * *

MIHAMA BEACH, OR MANAGEMENT IS THE SAME THE WORLD OVER!

Norman Dorn © June 2, 2010

In the early 1970s I accepted a field assignment from Westinghouse Electric to shoehorn process control computers into two nuclear power stations, Mihama Ichi-Go and Mihama Ni-Go.

These power stations (Unit 1 and Unit 2) shared buildings in a cove formed of decomposed lava, at the end of a peninsula extending into Wakasa Bay (Wakasa Wan) off the Sea of Japan. The site was at Mihama-Cho (the village), Mikata-Gun (the township), Fukui-Ken (the prefecture), Kansai (the region), Honshu (the home island), Nihon or Nippon (Japan, the country).

These stations incorporated “pressurized water” type power reactors (PWRs). Unit 1 was of an earlier design (using 10’ long fuel pins) and Unit 2 was of a later design (using 12’ long fuel pins). The pressurized water reactor design uses fuel enriched in the isotope U-235. Safety properties of this reactor design include self-regulating characteristics arrayed logically as defense in depth. The principal (very short reaction time) self-regulator is the so-called Doppler reactivity reduction that occurs in response to the heating of the fuel. At the point of initial criticality (first chain reaction and power production) the fuel is highly reactive: it is enriched, it has none of the neutron poisons created by burn-up, and it is physically cool because it contains no decaying fission products. The “physics testing” done during power-production startup includes monitoring the effectiveness of the “burnable” neutron poisons deliberately incorporated in the fuel and of the “chemical shim” neutron poisons in solution in the pressurized primary coolant. One consequence of this startup balancing act is that at the beginning of life of the initial fuel load in a PWR, reactivity has a small positive feedback from temperature-increase excursions.
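A rough way to picture the competing feedbacks described above is a simplified static reactivity balance (an illustrative sketch only, not taken from the plant documentation):

    \[ \rho_{net} = \rho_{rods} + \rho_{boron} + \alpha_{D}\,\Delta T_{fuel} + \alpha_{M}\,\Delta T_{mod} \]

Here the Doppler coefficient \( \alpha_{D} \) is always negative (heating the fuel promptly reduces reactivity), while the moderator coefficient \( \alpha_{M} \) can be slightly positive at beginning of life, when the chemical-shim boron concentration in the coolant is at its highest. That is the small positive temperature feedback referred to above.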

* * *
The process monitoring computer in each station had the purpose of recording the operating history of the reactor and the steam turbine plant and providing guidance to the operators about safe station operation with regard to the design criteria and design-basis events (the Tech Specs).

The Unit 1 computer was installed and started up by my immediate supervisor. For fun he augmented the programming with a “task” that monitored the intermediate-range (log-scale) neutron-flux sensor for an exponential flux growth rate (the external definition of a chain reaction just above the critical point). This task activated a “trouble location” annunciator (and its associated alarm bell), printed an alarm message reading, in part, “Approaching Criticality, SCRAM!” and activated the computer console alarm bell. This went over like a lead cloud in a station control room full of dignitaries when it occurred at initial criticality.
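For readers who want a concrete picture, here is a minimal sketch of what such a monitoring task might look like, written in modern Python rather than the original process-computer code. The sampling interval, alarm threshold, and names are assumptions for illustration only; the original task ran in real time on the station computer.

    import math

    # Illustrative parameters only -- not from the original Unit 1 task.
    SAMPLE_INTERVAL_S = 1.0      # seconds between intermediate-range flux samples
    ALARM_PERIOD_S = 100.0       # annunciate if the reactor period falls below this
    CONSECUTIVE_SAMPLES = 5      # require sustained growth before alarming

    def reactor_period(flux_prev, flux_now, dt):
        """Estimate the reactor period T from two successive flux samples.
        For exponential growth phi(t) = phi0 * exp(t / T), so
        T = dt / ln(flux_now / flux_prev); a small positive T means the
        chain reaction is just above the critical point and flux is rising."""
        if flux_prev <= 0.0 or flux_now <= flux_prev:
            return math.inf      # steady or falling flux: no concern
        return dt / math.log(flux_now / flux_prev)

    def monitor(flux_samples):
        """Scan intermediate-range flux readings and print an alarm
        when exponential growth is sustained."""
        hits, prev = 0, None
        for t, flux in enumerate(flux_samples):
            if prev is not None:
                period = reactor_period(prev, flux, SAMPLE_INTERVAL_S)
                hits = hits + 1 if period < ALARM_PERIOD_S else 0
                if hits >= CONSECUTIVE_SAMPLES:
                    print(f"t={t}s  Approaching Criticality, SCRAM!")
                    hits = 0
            prev = flux

    # Example: a flux doubling roughly every minute trips the alarm.
    monitor([100.0 * 2 ** (t / 60.0) for t in range(300)])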

I installed and started up the Unit 2 computer (which had some significant implementation changes in addition to those dictated by the station design differences), without that task. Because the prime contractor for Unit 2 had been told not to allow a repetition of that event, the contractor’s contract technicians secured (Shut Down) the Unit 2 computer for initial criticality, just when it would logically have been of most value.

Similar technical contexts and similar managerial errors have been identifiable in the much more public events at Three Mile Island (Harrisburg, PA, USA) and at Chernobyl (Ukraine).

* * *

DR. FISHER RESPONDS:

Norman,

I have known you and Bill Livingston for many years, and I respect your engineering minds and solid characters. DESIGN FOR PREVENTION reconnects us once again.

* * *

The engineering mind is nearly as exquisite as the mind itself. I have been a student of the discipline and a fan of practitioners for many years.

I am especially enamored of Dr. Nikola Tesla. He was the genius behind many things, not least the lighting of the World's Fair in Chicago in 1893. Edison, for whom he had worked, refused to let him use his incandescent light bulb. So he had to devise his own practically overnight and produce it to the tune of some 200,000 bulbs to create the magnificent lighting display at that World's Fair.

* * *

In the 1980s, I had the privilege of attending a week-long seminar in New York City that Joseph Moses Juran conducted in its entirety. He was in his mid-eighties and sharp as a tack. I never saw anyone use an overhead projector to create such powerful and illuminating schematics as he did that week.

Juran was a devotee of Pareto, focusing on the "vital few" problems rather than the "trivial many." He was an advocate of process in identifying and handling chronic problems at their source. DESIGN FOR PREVENTION supports his thesis, but in a more sophisticated manner.
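As a small illustration of the "vital few versus trivial many" idea (a sketch of my own, not Juran's method or data; the defect categories and counts are invented), a Pareto ranking simply sorts problem categories by frequency and shows how few of them account for most of the trouble:

    # Hypothetical defect counts by cause -- invented numbers for illustration.
    defects = {
        "wiring errors": 120,
        "calibration drift": 65,
        "documentation gaps": 30,
        "operator slips": 12,
        "material flaws": 8,
        "other": 5,
    }

    total = sum(defects.values())
    running = 0
    for cause, count in sorted(defects.items(), key=lambda kv: kv[1], reverse=True):
        running += count
        print(f"{cause:22s} {count:4d}  cumulative {100 * running / total:5.1f}%")
    # The first two causes (the "vital few") already cover about 77% of all defects.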

Juran was an immigrant like Tesla, only he was educated in Minneapolis where he went to high school and received his electrical engineering degree from the University of Minnesota. He worked for Bell Labs, and was part of that famous team at Western Electric’s Hawthorne Works in Chicago, teaching statistical quality control. It was 1925 and he was 21. He would live to be 103.

The industrial psychologist Elton Mayo joined the team at Hawthorne in 1927, and thus were born the “human relations movement” and social engineering. Mayo noted that output improved whatever cosmetic changes were made to working conditions. He built his thesis on manipulating workers for greater productivity, which was then refined in the human resources movement to the pusillanimous end of productivity in HR hands today.

* * *

The disconnect between context (culture) and process has yet to be repaired. Engineers have given us our modern society, but they have not been as attentive to the social context and cognitive biases of enterprise. Livingston treats this shortfall with care, profundity, and ruthless clarity in DESIGN FOR PREVENTION.

* * *

My wife BB is business manager at a Jewish Day School. The head of school is opening up and radically changing the physical environment for teaching. Her eclectic ideas about openness and foresight are all taken from engineering. The problem with this strategy is that the teaching staff is older and steeped in a culture of command and control, the status quo, and business as usual.

Livingston recognizes this dilemma. He argues quite forcefully that you cannot change culture, and you can’t. But culture can change when the content, context, and process of the design allow a propitious time frame and a prevention strategy that takes proper cognizance of human factors. Culture proceeds consistently with natural law and cannot be forced.

* * *

I have neither the competence nor the training to fully understand the toxic posture of management at the Japanese nuclear power facilities. What I do know, and Livingston’s book is a discourse on the challenge, is that engineering acumen must, and I emphasize must, consider the social and cultural aspects of operations as part of the DESIGN FOR PREVENTION.

Culture is not something to slough off to human resources, or industrial engineering. Culture is something that must be designed into prevention. No one understands this better than Livingston.

Engineers have created our wonderful world, and they have watched it blow up in our faces in the Gulf of Mexico with the BP oil rig explosion on April 20, 2010, some forty-two days ago. I wonder whether they were victims of the Abilene Paradox.

* * *

At Honeywell, I worked closely with engineers in creating the Technical Education Program, after discovering that most of the technology engineers were working on had been developed after they had left school. Moreover, the highest paid engineers were the least competent because of their obsolescent skills. I persuaded management to fund a $ million training program for all engineers that was thriving when I retired in 1990.

* * *

We are past the point of criticizing management. Engineers must think like leaders and managers or we will never get on firm ground. The irony is that many CEOs were educated as engineers, but once in management, became guilty of all the things Livingston presents in DESIGN FOR PREVENTION.

* * *

Great engineering minds have captured human folly as well as the limits of institutional protocol. Alan Turing, William Ashby, and Rudy Starkermann, among others, have observed institutions dying from entropy, and society with them. It is why I would like to see non-engineers read this book.

* * *

Many years ago I published an article on the “Soul of the Engineer” in the SHORT-CIRCUIT NEWSLETTER. In that piece, I charged that engineers were conformists, comfortable in their technology but indifferent to taking a step into the world beyond. It is that world that is inextricably connected to their world today.

Not to be misconstrued, I don’t expect engineers to become social engineers. Social engineers have accelerated entropy. I expect them to “take charge” at the level of consequences, to find a way to use control theory and reconnect enterprise to purpose in an open system.

* * *

Livingston writes, “If you knew so much about the impending train wreck, why didn’t you prevent it? It is unclear why we are so willing to accept the crisis response proposals from the same people that stood back and let the crisis unfold.” Think BP and the Gulf of Mexico oil rig disaster.

DESIGN FOR PREVENTION can be read at many levels. My desire is that engineers no longer avoid conflict and confrontation, but become schooled in the politics of people as well as mathematical physics.

With confrontation avoidance, engineers fall into the habit of the ABILENE PARADOX. That is where the group decides on a single course of action that every member of the group privately opposes but publicly embraces because it protects their jobs, is safe, and is much less wrenching. Think of the BP oil rig disaster again.

* * *
