Boeing report highlights human factors no company should ignore

Financial Times

OCTOBER 7, 2019

Organisations are misreading the way people respond to stress from technology


I doubt any workplace has been subject to closer study than the cockpits of civil aircraft. It is where workers’ relationship with complex machines and with each other can be at its most intense, and the consequences of failure most catastrophic. Thanks to flight recorders, those interactions can be analysed in detail, as most high-pressure decision-making cannot.

The recent recommendations of the US National Transportation Safety Board following the fatal crashes of two Boeing 737 Max jets astonished me, therefore. The NTSB identified, in the words of its chairman, “a gap between the assumptions used to certify the Max and the real-world experiences” of the ill-fated Lion Air and Ethiopian Airlines crews. The implication was that, in designing and approving new software for the plane, Boeing and its regulator, the Federal Aviation Administration, had overestimated how quickly and effectively pilots and crew would respond to the multiple alerts triggered by in-flight emergencies.

Probes into the two crashes are continuing. Boeing has updated the software linked to the accidents and is refining its procedures and training. But despite decades of deep analysis of how flight crew behave under pressure, an experienced aircraft-maker apparently needed to be told to reconsider its approach. I dread to think how other less practised organisations may be misreading how people respond to the increased workload and stress imposed by technological advances.

Aviation was the birthplace of modern ergonomics, which applies understanding of interactions between people and systems to the design of those systems.

The foundation stories include that of the young US army air force psychologist Alphonse Chapanis who, at the height of the second world war, realised that B-17 Flying Fortress bombers were often crash landing because tired pilots were confusing the switch to control the wing flaps with the neighbouring switch used to retract the landing gear. What had been judged “pilot error” was in fact a design fault, relatively easily corrected.

I prefer the term “human factors”, more prevalent in the US, because skimping on ergonomics just sounds like an excuse for having ordered a batch of cheap office chairs. Neglect human factors, on the other hand, and it is pretty obvious you will be sowing the seeds of disaster everywhere from the boardroom downwards.

This risk is one reason the discipline spread from aviation to other organisations where a breakdown in the smooth interdependence of people and systems could be life-threatening: nuclear plants, hospitals, carmakers, drug manufacturers and ultimately technology companies. As their influence over how we live increases, and ever more powerful robots and algorithms are applied to bigger tasks, so do the consequences of misreading the human factor.

This is not a Trumpian call to reverse technological advances. Collaborative machines can take on drudgery and dangerous tasks that humans used to carry out. But even in the controlled space of a factory, such machines need to be designed to react correctly to their unpredictable flesh-and-blood co-workers. When such technology is introduced into the wild, the range of potential mishaps widens and increases the temptation among programmers to counter variable human factors with more technology.

Even when we are stationary at our desks, the distraction of ringtones and notifications can be overwhelming. It is easy to imagine how, in an emergency, multiple alerts similar to those that may have confused or distracted the crews of the doomed Lion Air and Ethiopian flights could bamboozle drivers of “semi-autonomous” vehicles, with their “autopilot” software.

The aviation industry has been here before. It had to go through a series of avoidable accidents before it realised, as late as the 1970s, that it should mitigate the risk not just with better engineering or more specialist training, but with better communication between crew members. Now crew resource management, which replaced deference to the captain with a flatter, more open, co-operative approach, is held up as a model for non-hierarchical teamwork in many other domains.

Introducing its recommendations, the NTSB pointed out that Boeing’s highly trained test pilots were used to trying out new products such as the system at the centre of the 737 Max investigations. The safety board suggested manufacturers and regulators should pay more attention to how the average pilot would react. I would like to read that as a plea to all product developers, designers and their bosses. Lower your sights occasionally from the superhuman feats that technology enables and consider how to take better account of the average humans who should be at the core of everything you do.