WHAT BOEING’S 737 MAX HAS TO DO WITH CARS: SOFTWARE

SOFTWARE EATING THE world may sound good to tech mavens. But the now eight-year-old maxim has its serious downsides. Software defects have been blamed for the Boeing 737 MAX 8 crashes in October and March, which together killed 346 people. The aircraft has been grounded worldwide for three months as investigators from Indonesia (where the first plane crashed), Ethiopia (where the second plane crashed), and the US National Transportation Safety Board and Federal Aviation Administration work to determine why the airliners went down, and how the airplane might be fixed.

This week, pilots working with the FAA flagged another issue with the airplane, one that will likely delay its return to service until September or October. According to The Wall Street Journal, the problem stems from a lack of redundancy. During simulated flight tests, federal investigators reportedly found that if a chip inside the flight-control computer fails, it can cause a panel in the airplane's tail to move, pushing the nose downward. Investigators reportedly uncovered the defect while testing the airplane under highly unusual conditions, but the FAA is requiring Boeing to fix it before it allows the 737 MAX to fly again.
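
The concern here is a classic redundancy problem: a single failed chip should not be able to move the tail on its own. As a deliberately simplified sketch (the function names, tolerance, and fail-safe behavior below are invented for illustration and do not describe Boeing's actual flight-control architecture), a design that cross-checks two independent channels before commanding the stabilizer might look something like this:

```python
# Hypothetical sketch only: a toy illustration of why redundancy matters in a
# flight-control computer. Names, values, and logic are invented and do not
# describe any real avionics system.

def command_stabilizer(primary_cmd, monitor_cmd, tolerance=0.5):
    """Return a stabilizer trim command (degrees) only if two independent
    processing channels agree; otherwise fail safe and command no movement."""
    if abs(primary_cmd - monitor_cmd) <= tolerance:
        return primary_cmd   # channels agree: pass the command through
    return 0.0               # disagreement (e.g., a failed chip): freeze the stabilizer


# A single-channel design would forward primary_cmd unconditionally, so one
# failed processor could move the tail by itself. With the cross-check, a
# disagreement halts the motion instead.
print(command_stabilizer(2.0, 2.1))   # agreement -> 2.0
print(command_stabilizer(2.0, -4.0))  # disagreement -> 0.0
```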

“Boeing will not offer the 737 MAX for certification by the FAA until we have satisfied all requirements for certification of the MAX and its safe return to service,” the aircraft manufacturer said in a statement. Boeing reportedly believes this chip issue can be fixed with a software tweak (though some experts disagree). The company did not respond to a request for comment.

A preliminary report from Indonesia's aviation authority about the October crash pinned the plane's trouble on software, pointing to a system called the Maneuvering Characteristics Augmentation System, or MCAS. The 737 MAX's engines sit higher and farther forward on the wing than on previous generations of the airplane, which in certain situations can push the airplane's nose up, increasing the likelihood of a stall. MCAS detects that nose-up pitch and uses the stabilizer on the airplane's tail to bring the nose back down. On the downed planes, a faulty sensor may have triggered MCAS when it shouldn't have, leaving the pilots wrestling with the planes as they struggled to pull the noses back up.
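
To see why one bad sensor reading is so dangerous, consider a toy example (the thresholds, names, and logic below are assumptions made up for illustration, not how MCAS is actually implemented): a trigger that trusts a single angle-of-attack value will fire on a faulty reading, while one that cross-checks two sensors can treat a large disagreement as a fault rather than a stall.

```python
# Hypothetical sketch only: a single-sensor trigger versus a cross-checked one.
# All thresholds and behavior are invented for illustration.

STALL_AOA_THRESHOLD = 15.0  # degrees, assumed value for this example

def nose_down_single(aoa: float) -> bool:
    # Trusts one angle-of-attack sensor: a stuck or miscalibrated reading
    # above the threshold triggers nose-down trim even in normal flight.
    return aoa > STALL_AOA_THRESHOLD

def nose_down_crosschecked(aoa_left: float, aoa_right: float,
                           max_disagreement: float = 5.0) -> bool:
    # Acts only when both sensors agree the angle of attack is high;
    # a large disagreement is treated as a sensor fault, not a stall.
    if abs(aoa_left - aoa_right) > max_disagreement:
        return False
    return min(aoa_left, aoa_right) > STALL_AOA_THRESHOLD

# One faulty sensor reads 22 degrees while the other reads 4:
print(nose_down_single(22.0))              # True  -> unwanted nose-down command
print(nose_down_crosschecked(22.0, 4.0))   # False -> fault detected, no action
```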

Which is all to say: Building perfect software is hard, and testing it for faults is complicated. “I think there isn’t anything that makes finding defects in aircraft software uniquely difficult. Rather, finding subtle defects via testing is difficult in all software,” says Philip Koopman, a professor of electrical engineering at Carnegie Mellon University and the CTO of the startup Edge Case Research, which tests safety-critical software for defects.

Even so, the creators of aviation software have gotten pretty good at it. In 2018 a commercial aviation accident occurred every 740,000 flights, with one involving a major jet happening every 5.4 million flights, according to the International Air Transport Association. In fact, deadly software defects have been more commonly associated with automotive crashes than airplane crashes. Automotive recalls linked to electronic and software failures jumped 30 percent a year between 2012 and 2016, according to the consultancy AlixPartners (though federal data shows that, in recent decades, vehicles have become safer for their occupants).

Koopman doesn’t have any inside knowledge on the Boeing 737 MAX crashes, but he says software issues in both sorts of transportation machines probably stem from a common engineering principle: The more safety-critical an element of the software is thought to be, the more rigorously it is built and tested. The problem with both automotive and aviation software comes when engineers determine an element isn’t safety-critical—and then it turns out to be.

Airplane software is more likely to be viewed by engineers as safety-critical, Koopman says. After all, a failure generally means the thing will fall out of the sky. That might help explain why you see fewer crashes linked to airplane software issues than you do those linked to automotive software. (Other explanations: There are way more cars in the world than airplanes, and pilots face more rigorous training than your average driver.)

Still, the explanations for software defects like those found in the Boeing aircraft and those found in vehicles may be similar. Advanced driver-assistance features like Tesla’s Autopilot and General Motors’ Super Cruise assume a human is paying attention to the road and is ready to take over if their automated lane-changing or forward-collision features fail. (These systems do have varying—and controversial—methods of ensuring that drivers are indeed paying attention.) But if a software bug prevents pilots or drivers from resuming control of the machine, “that’s a big problem that can result in fatalities,” Koopman says.
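
That assumption can be made explicit in software rather than simply trusted. As a purely hypothetical sketch (the timeout, function names, and fallback behavior here are invented, not any automaker's implementation), a system that verifies the handover instead of assuming it might work like this:

```python
# Hypothetical sketch only: checking that a human actually takes over,
# and degrading safely if they don't. Timings and names are invented.

import time

TAKEOVER_TIMEOUT_S = 4.0  # assumed grace period for the driver to respond

def request_takeover(driver_acknowledged, begin_safe_stop):
    """Ask the driver to take over; if they don't respond in time, fall back."""
    deadline = time.monotonic() + TAKEOVER_TIMEOUT_S
    while time.monotonic() < deadline:
        if driver_acknowledged():   # e.g., hands on wheel, eyes on road
            return "driver_in_control"
        time.sleep(0.1)
    begin_safe_stop()               # no response: slow down, hazards on, pull over
    return "fallback_engaged"

# Example: a simulated driver who never responds.
print(request_takeover(driver_acknowledged=lambda: False,
                       begin_safe_stop=lambda: print("slowing to a stop")))
```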

Fortunately for anyone who flies in Boeing airplanes, it appears the 737 MAX is now getting the top-to-bottom safety and engineering review it needs. Let's hope the same happens for all the software that helps people get around.
