UTILITIES SECTOR

The utilities industry manages risk on a daily basis. Commercial pressures must be balanced against a healthy regard for safety. A robust operation should focus not only on technical issues, but also on the non-technical challenges associated with working in demanding environments.

Organisations that lose focus on safety risk losing public confidence, and with it the viability of their business.
A vital part of the safety jigsaw is human performance.

THREE MILE ISLAND CASE STUDY

The Three Mile Island accident was a partial nuclear meltdown that occurred on March 28, 1979, in reactor number 2 of the Three Mile Island Nuclear Generating Station (TMI-2) in Dauphin County, Pennsylvania, United States. It was the most significant accident in U.S. commercial nuclear power plant history.

The accident began with failures in the non-nuclear secondary system, followed by a stuck-open pilot-operated relief valve in the primary system, which allowed large amounts of nuclear reactor coolant to escape. The mechanical failures were compounded by the initial failure of plant operators to recognize the situation as a loss-of-coolant accident due to inadequate training and human factors, such as human-computer interaction design oversights relating to ambiguous control room indicators in the power plant's user interface.

Critical human factors and user interface engineering problems were revealed in the investigation of the reactor control system's user interface. Despite the valve being stuck open, a light on the control panel ostensibly indicated that the valve was closed. In fact, the light did not indicate the position of the valve, only whether the solenoid was powered, thus giving false evidence of a closed valve. As a result, the operators did not correctly diagnose the problem for several hours.

The design of the pilot-operated relief valve indicator light was fundamentally flawed. The bulb was simply connected in parallel with the valve solenoid, thus implying that the pilot-operated relief valve was shut when it went dark, without actually verifying the real position of the valve. When everything was operating correctly, the indication was true and the operators became habituated to rely on it.
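
This flaw can be illustrated with a short, purely hypothetical Python sketch (not plant software; the class and function names are invented for illustration). An indicator wired to the command signal reports "closed" as soon as the solenoid is de-energised, whereas an indicator driven by an independent position sensor reports the valve's true state:

# Hypothetical illustration only -- not plant code.
class ReliefValve:
    def __init__(self):
        self.solenoid_energised = True   # the operators' command: open
        self.actually_open = True        # the real, physical valve state

    def command_close(self):
        self.solenoid_energised = False
        # A stuck valve does not follow the command:
        # self.actually_open remains True.

def solenoid_lamp(valve):
    # TMI-style lamp: wired in parallel with the solenoid,
    # so it reflects only the command that was sent.
    return "OPEN" if valve.solenoid_energised else "CLOSED"

def position_lamp(valve):
    # Safer design: driven by an independent sensor on the valve itself.
    return "OPEN" if valve.actually_open else "CLOSED"

valve = ReliefValve()
valve.command_close()
print(solenoid_lamp(valve))   # prints CLOSED -- false reassurance
print(position_lamp(valve))   # prints OPEN   -- the true state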

However, when things went wrong and the main relief valve stuck open, the unlit lamp actually misled the operators by implying that the valve was shut. This caused the operators considerable confusion, because the pressure, temperature and coolant levels in the primary circuit, so far as they could observe them via their instruments, were not behaving as they would have if the pilot-operated relief valve were shut. This confusion contributed to the severity of the accident because the operators were unable to break out of a cycle of assumptions that conflicted with what their instruments were telling them. It was not until a fresh shift came in, one that did not share the mind-set of the first shift of operators, that the problem was correctly diagnosed. By this time, major damage had occurred.

Bespoke Keynote Speaking
Flight Simulator Away Day
4 Individual Masterclasses

WIDER INDUSTRIAL INCIDENTS

Tenerife Air Disaster

Human error: assertiveness skills, hierarchy gradient, communication error

On March 27, 1977, two Boeing 747 passenger jets collided on the runway at Tenerife North Airport.

The steep hierarchy gradient within the KLM cockpit allowed an error made by the senior Captain to go unchallenged by the junior Co-Pilot and Flight Engineer.

The crash killed 583 people, making it the deadliest accident in aviation history. 

Southall Train Crash

Human error: situational awareness, poor sustained attention

The crash occurred when the 10:32 Great Western Trains passenger service from Swansea to London Paddington hit a freight train. The high-speed train had departed with a defective Automatic Warning System (AWS) and then passed a red (danger) signal, preceded by two cautionary signals.
The driver failed to notice the danger signals. Seven people died, and Great Western Trains was fined £1.5 million.

Costa Concordia

Human error: rule violation, weak leadership, lack of assertiveness by the ship's bridge crew

The cruise liner deviated from its planned GPS route in order to sail close to a former Captain's home on the Italian island of Giglio. The huge vessel struck rocks and capsized.
The ship's Captain, Francesco Schettino, recklessly violated company rules, disabled warning systems, and showed poor leadership during the evacuation of passengers.