Now, thanks to recent revelations and the preliminary investigation by the Federal Communications Commission, we know that the Jan. 13 false warning of a ballistic missile attack in Hawaii was not caused simply by an employee, one of the three on-duty warning officers, “pressing the wrong button” or “clicking the wrong icon” during a drill, as the Hawaii Emergency Management Agency alleged.
After reading the lines (and between the lines) of the nine-page FCC report and other accounts, one can easily visualize the woefully inadequate, totally flawed (or even nonexistent) “system” by which the Hawaii Emergency Management Agency integrated its human, organizational and technological subsystems. That failure constituted the root cause of the incident.
It also confirms that the front-line grunt cannot simply be considered the “cause” of the problem. Rather, the operator’s action was the effect of a complex web of contributing factors.
It isn’t simply the fault of the front-line human operator, who supposedly wasn’t paying enough attention or made errors when he or she should have been more careful. We need to focus on the whole “work system” that contributes to human error: e.g., workstation and interface design, task dimensions, procedures, training, the operator’s workload, degree of expectancy, situational awareness and uncertainty.
I have seen, on many occasions, control board operators fall victim to what is known in my technical fields as “design-induced error.” What is not always mentioned is that a good majority of these “errors,” or instances of “negligence,” were in fact system-induced.
According to several studies, including my own, it should be remembered throughout the investigation that error is a consequence and not necessarily a cause.
As the world-renowned scholar James Reason of England’s University of Manchester observed, in words that could characterize many technological system accidents, including Hawaii’s false alarm: “Rather than being the main instigators of an accident, operators tend to be the inheritors of system defects created by poor design, incorrect installation, faulty maintenance and bad management decisions. Their part is usually that of adding the final garnish to a lethal brew whose ingredients have already been long in the cooking.”
As such, it is a gross oversimplification to attribute accidents to the actions of front-line operators or the negligence of their immediate supervisors before investigating all the causes contributing to the system’s failure.
My past 30 years of research at the University of Southern California into the safety of complex technological systems bears this out. It has shown that on many occasions human errors are caused by human-machine interface, human-task or human-organization mismatches.
The error and its consequences are the result of a multitude of factors, including poor workstation and workplace designs, complicated operational processes, unreasonable mental and/or physical workloads and inadequate staffing, faulty maintenance, ineffective training, nonresponsive managerial systems, unhealthy safety culture, dysfunctional organizational structures, and haphazard response systems.
Safety culture, furthermore, goes beyond specific rules and rote adherence to standard operating procedures in any organization. Creating a safety culture means instilling attitudes and practices in individuals and organizations that ensure safety concerns are proactively addressed and treated as a high priority.
An organization fostering a strong safety culture encourages employees to cultivate a questioning attitude and a prudent approach to all aspects of their jobs, and creates open communication between workers and management.
Although human errors are undesirable, and accidents are sad phenomena that can have long-lasting impacts, there are many lessons society and industry can learn from them. We should realize that we cannot uproot human error, but we can (and should), at least, thoroughly study our emergency-alerting ecosystem, proactively identify error-producing conditions and minimize their occurrence.
“It is astonishing that no one was hurt” in Hawaii, said Michael O’Rielly, a commissioner of the FCC. Nevertheless, the false emergency alert and the serious system-oriented problems of the Hawaii Emergency Management Agency should be a rude awakening for the rest of the country and remind us, once more, of what the Nobel physicist Richard P. Feynman said in the context of another technological system failure, the explosion of the space shuttle Challenger in 1986: “When playing Russian roulette, the fact that the first shot got off safely is little comfort for the next.”
Najmedin Meshkati, a professor of civil/environmental engineering and industrial systems engineering at the University of Southern California, teaches and conducts research on human-systems integration and safety culture in complex technological systems failures.
The aim of this article is to shed light on the scientific side of the issue: to show that behind all the controversy and finger-pointing there is a vast body of knowledge that can and must be used to identify the weak points of the alert system and to implement countermeasures against human error. The whole point is that the false alarm was not one person’s wrongdoing; it was the predictable outcome of a poorly designed system. Punishing or firing people will not solve the problem; it only covers up the incompetence of the system’s designers for a while.