ONE WAY to tell who made the aircraft you are boarding is to steal a glimpse of the cockpit. A traditional control yoke in front of the pilots suggests a Boeing; a joystick beside each seat, an Airbus. Pilots argue about which system is better; neither is considered safer than the other. Each exemplifies a different approach to a problem that manufacturers of not just aircraft but also cars, trains and ships must grapple with as long as human operators handle increasingly automated machines.
The challenge of what engineers call the “human-machine interface” gained tragic prominence after the crash of an Ethiopian Airlines Boeing 737 MAX 8 on March 10th. Eyewitnesses reported that shortly after departing Addis Ababa, the aircraft climbed and dived repeatedly. Comparisons were drawn with a fatal crash in Indonesia in October last year. That time, the pilots of a Lion Air MAX 8 struggled, also soon after take-off, with an automated safety system that erroneously tried to prevent the aircraft from stalling by lowering its nose.
Although authorities around the world have grounded the model, Boeing insists that it is airworthy. The company is updating the MAX’s automated flight-control software to make it easier for pilots to assume manual control. Boeing and Airbus both pack their planes with computers that do most of the flying. Each, though, espouses a different philosophy about how pilots interact with them, says Mudassir Lone of Cranfield University in Britain. Boeings are designed to make the pilot feel like the aviator in charge. Although the control yoke looks and feels like something from the analogue era, the way it behaves—including shaking when approaching a stall—is created digitally by a computer. Airbus’s joystick is seldom used except during take-off and landing. A sound alerts the pilot to trouble; in an Airbus, the pilot is more supervisor than airman.
The big worry is what happens if a sensor feeds the flight-control system the wrong data. This might have happened in the Lion Air crash, according to a preliminary report. Something similar downed an Air France Airbus A330 over the Atlantic in 2009: an airspeed sensor iced over and the ensuing loss of data caused the autopilot to disengage. Unable to work out what was happening, the pilots lost control.
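A common defence against a single faulty sensor is redundancy with cross-checking: several sensors measure the same quantity, the computer rejects outliers, and it falls back to the pilots when no trustworthy value can be established. The sketch below is purely illustrative; the function name, tolerance and voting scheme are our assumptions, not the logic of any real Boeing or Airbus system.

```python
def vote_airspeed(readings, tolerance=10.0):
    """Median-vote across redundant airspeed readings (in knots).

    A reading that differs from the median by more than `tolerance`
    is treated as suspect. If fewer than two readings agree, the
    function declares the data unreliable rather than guessing.
    Names and thresholds are illustrative only.
    """
    readings = sorted(readings)
    median = readings[len(readings) // 2]
    trusted = [r for r in readings if abs(r - median) <= tolerance]
    if len(trusted) < 2:
        return None  # data invalid; hand control back to the pilots
    return sum(trusted) / len(trusted)

# With three sensors, one iced-over reading is simply outvoted:
vote_airspeed([250.0, 252.0, 30.0])  # averages the two trusted readings
# With only two, a disagreement leaves the computer unable to tell
# which sensor is lying:
vote_airspeed([250.0, 30.0])  # returns None
```

With three sensors an outlier is outvoted; with only two, a disagreement gives the computer no way to tell which one is wrong, which is why declaring the data invalid and alerting the crew may be the safer choice.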
Switching from automatic to manual is not straightforward. Flight-control systems may not disengage entirely. Instead, they might continue to assist the pilot in an attempt to prevent a dangerous manoeuvre. When things do go wrong, it is critical that pilots follow the correct procedures, which differ for each model of aircraft. Pilots learn these and carry checklists spelling them out. The proliferation of such systems necessitates frequent retraining. To make life easier for pilots, the MAX 8 employs a system that makes it feel, to them, like older, more familiar versions of the 737. But this adds another layer of complexity.
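The handover is rarely a single on-off switch. One way to picture it is as a small state machine in which disengaging the autopilot lands the pilot in an assisted mode, with protections still active, and only degraded sensor data (or a further deliberate step) strips them away. The mode names and transition rules below are hypothetical, a sketch of the idea rather than any manufacturer's actual logic.

```python
from enum import Enum

class Mode(Enum):
    AUTO = "autopilot engaged"
    ASSISTED = "manual input, envelope protections active"
    DIRECT = "raw manual control, no protections"

def next_mode(mode, pilot_disengages, sensors_valid):
    # Hypothetical transitions: switching off the autopilot does not
    # remove all assistance, and losing reliable sensor data can
    # quietly strip the remaining protections.
    if mode is Mode.AUTO and pilot_disengages and sensors_valid:
        return Mode.ASSISTED
    if mode in (Mode.AUTO, Mode.ASSISTED) and not sensors_valid:
        return Mode.DIRECT
    return mode
```

The point of the sketch is the middle state: a pilot who believes the aircraft is "in manual" may still be flying through a computer, and may drop into a genuinely unassisted mode only when the system's own inputs fail.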
Incidents are not confined to aviation. In Washington, DC, automated trains have largely been out of service since 2009, when a faulty circuit made a stationary train invisible to the safety system on the one behind it. The driver was unable to brake in time; the resulting crash killed nine people. Ships may soon face similar problems. Some ferries and offshore support vessels have already replaced ship’s wheels with computer-assisted joysticks. A series of accidents involving self-driving cars may have been caused by sensors failing to recognise objects in the road and by drivers failing to respond fast enough.
Studies have shown that when people have to wrest control from an automated system, it can take them around five seconds to grasp what is happening. The monotony of monitoring a semi-automated vehicle may reduce vigilance by provoking what psychologists refer to as “passive” fatigue. Such concerns have led some carmakers, Ford among them, to consider skipping semi-automation and going straight to something closer to full autonomy, cutting people out of the loop. That would remove the human-machine interface—but not humans’ machine-induced fears.
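The five-second figure is easier to appreciate as distance covered. A back-of-the-envelope calculation (the function name and the motorway speed are our assumptions; only the five-second reaction time comes from the studies cited above):

```python
def takeover_distance(speed_kmh, reaction_s=5.0):
    """Metres travelled while a driver works out what the automation
    is doing, using the roughly five seconds cited above."""
    return speed_kmh / 3.6 * reaction_s

# At an assumed motorway pace of about 110 km/h, five seconds of
# confusion covers roughly 150 metres of road.
takeover_distance(110)
```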