Red flags

I've been following with interest the controversy over Boeing 737 Max 8 and 9 aircraft, without being able to make out whether the aircraft or pilot training is the biggest problem.  HotAir's Jazz Shaw published a startling piece this morning reporting that the Lion Air flight that went down last year had narrowly escaped almost exactly the same fate the day before.  They were saved by the coincidental presence in the cockpit of an off-duty co-pilot who correctly diagnosed the problem and told the crew how to disconnect the malfunctioning flight control system.

There must be strong pressures to put the cone of silence over near-misses like this. Still, how would you like to be the guy who didn't speak up, or the guy who squelched his report?

14 comments:

E Hines said...

It's interesting to see the difference in outcome between high-time and well-trained pilots and low-time pilots given only lick-and-a-promise training. American airlines have flown the MAX for some years without incident.

None of which means Boeing shouldn't be required actually to fix its hardware and software problems, as this Twitter thread implies: https://twitter.com/trevorsumner/status/1106934362531155974

David Foster said...

In the United States, there is a system called the ASRS, the Aviation Safety Reporting System. Pilots, mechanics, and controllers are encouraged to submit anonymous reports about safety-related concerns. The better to protect anonymity, the ASRS is administered by NASA rather than by the FAA.

Other countries would do well to have something similar...However, the ASRS database is publicly-available, so one would hope the aviation authorities, airlines, and manufacturers in those countries would be regularly browsing the reports relevant to their operations.

David Foster said...

Re the Twitter thread: it seems pretty legalistic to say that something was not a software problem because the software followed the specification accurately, even when the specification was bad. "Software problem" should encompass the specification and suitability for task.

E Hines said...

Following spec was addressed in the next sentence: The specification was just sh*y.

I didn't have the impression he was deprecating the software so much as he was hammering the spec. As a project manager who wrote requirement specs and a test director who wrote test specs, maybe I'm just spring-loaded to give weight to specifications.

It's true that it's not enough when software meets specification, but that's all software and the programmers who write it can be expected to achieve. The main software responsibility is with the specification writers and the testers.

Eric Hines

David Foster said...

"The main software responsibility is with the specification writers and the testers."

Yes, I agree with that. Point I was trying to make is that successful software creation involves a lot more than "coding", and hence, software problems are not always coding problems.

David Foster said...

The fix for this system will apparently include logic to trigger the nose-down action only if redundant sensors agree that the angle of attack is too high.

There is also a potential safety problem with this. If there are 2 sensors, then failure of *either* of them could lead to a situation in which a *genuine* problem with a high angle of attack would fail to trigger the remedial action (since the two measurements would not agree).

I haven't worked through the probabilities, and don't have the data to do so, but I'm pretty sure this would still be a safety improvement over the current situation.

THREE sensors, with a majority vote, would be best.
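The two-sensor versus three-sensor argument above can be sketched in a few lines. This is purely a hypothetical illustration of the voting logic, not Boeing's actual MCAS implementation; the function name, the stall threshold, and the disagreement threshold are all invented for the example.

```python
def mcas_should_activate(aoa_readings, stall_threshold=15.0, max_disagreement=5.0):
    """Decide whether to command nose-down trim, given angle-of-attack
    readings (in degrees) from redundant sensors. Hypothetical sketch only.
    """
    if len(aoa_readings) == 2:
        # Two-sensor scheme: act only if the sensors agree AND both read high.
        # A single failed sensor blocks action even in a genuine stall --
        # the failure mode described in the comment above.
        a, b = aoa_readings
        if abs(a - b) > max_disagreement:
            return False  # disagreement: inhibit action, annunciate to the crew
        return min(a, b) > stall_threshold
    # Three-sensor scheme: majority vote. One failed sensor is simply outvoted.
    votes = sum(1 for r in aoa_readings if r > stall_threshold)
    return votes >= 2
```

The probability intuition works out the same way: with an independent per-sensor failure probability p, the two-sensor scheme is defeated by any single failure (roughly 2p), while the 2-of-3 voter needs at least two simultaneous failures (roughly 3p² for small p).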

E Hines said...

THREE sensors, with a majority vote, would be best.

Respectfully disagree. Best would be training the pilot to go manual and attend to his analog sensors--which still should be on board, functional, and deiced. Sensors like altimeter, attitude indicator, airspeed indicator, heading indicator, and turn/slip indicator. Those give all the information that's needed to fly the airplane out of trouble and to an airport to land.

In fact, the altimeter, airspeed, and heading are all he really needs. Those functioning, and the basic flight training to use them, would even have saved a Brazil-to-Europe flight a few years ago (though some basic crew coordination would have had a favorable impact, too), let alone the two MAX flights in the last few months.

Eric Hines

David Foster said...

I totally believe all pilots should keep up hand-flying skills...but it is still worthwhile to have systems which identify problems and in some cases take immediate action. "Stick pushers" were introduced decades ago to push the nose down in an immediately-incipient stall situation; I believe they have been used mainly on airplanes that have extremely evil stall-recovery characteristics.

A stick pusher will forcibly push the stick (usually actually a yoke) forward; the pilot can pull against it, though it takes some strength, and can then turn it off if it actuated in error (and there is still altitude). Don't think it could be missed, unlike the 737 Max 8 system, which rapidly adjusts the trim and *can* be missed by the flight crew. Would be interesting to understand Boeing's reasoning for using the system they did as opposed to a stick pusher system.

I've seen allegations in various places that the 737 MCAS system can't be turned off without going through a complex procedure involving turning off the entire bus bar that powers it. Pretty sure that's wrong; there are trim disconnect switches on the center console. The problem isn't any difficulty with turning off the MCAS; it's with recognizing what's happening quickly enough.

E Hines said...

On the Lion Air crash, supposedly an experienced pilot deadheading in the engineer's seat on the same aircraft the day prior saved the aircraft from an identical crash situation by walking the pilot-in-command through the procedure for deactivating the MCAS: flipping two switches. Evidently, too, the switches were not intuitively apparent; it took training and experience to know which two. And knowing the checklist, which also listed the switches to be turned off in such situations.

Most fighters are, by design, inherently longitudinally unstable (and easily rolled, but stably). Not even the F-22 uses such a thing as an MCAS or any other autotrim. Runaway trim situations result from stuck switches or trim software that must first be triggered by the thumb on the trim switch. My impression is that the F-35 and the AV-8 don't mess with such nonsense, either. Airliners are, by design, much more stable and shouldn't need such things.

My impression of the MAX is that Boeing tried a software fix to a hardware problem: engines occupying more volume and sitting farther forward than the fuselage was designed to handle. That the system worked in the US suggests that training and time-in-type matter. Whether the system requires too much experience and training for the "average" national airline system is a discussion worth having.

It's also highly useful to have systems and sensors telling--or suggesting to--the pilot the aircraft's status and behavior, but surrendering pilot duties to a computer is just foolish. That might be necessary for commanding a rocket--that's a system that's wildly sensitive to tiny thrust vector variations, but subsonic, even mildly supersonic, air-breathers don't work well under computer control. Look how hard it is to build a self-driving car.

Eric Hines

douglas said...

As we've become more and more infused with technology, we've blurred the lines of where it begins and we end. Blurring that interface point is what caused these crashes (well, that and not doing anything to resolve that, and forcing a design to work). In my completely uninformed and amateur opinion, both engineers and pilots/trainers don't give proper attention to the interface points, and making sure the humans who will be using the machines know where those points are. In this case, pilots not knowing where that particular point was, and engineers not being concerned with how they would know where that point was came together with catastrophic results.

Robert Macaulay said...

One of the cures for bad software is a good flight crew.
Good flight crews are expensive. Inadequate flight crews are even more expensive.

E Hines said...

One of the cures for bad software is a good flight crew.

They don't cure anything, they just make it possible to get away with a fair amount of bad software. The "correct" response to the MAX's MCAS malfunction was to turn it off, not to cure it.

Flying, at bottom, really is as simple as pushing forward to go down, pulling back to go up, left/right just like driving a car. Except that as we demand more of our aircraft, better sensors, better software, better fusion helps a lot and often is critical.

To Douglas' point just above, that interface between pilot and technology isn't only blurred--mostly by bad training, I say--it's susceptible to being distracted from. Just look at the extraneous technology in our cars that encourage drivers to do things other than driving. The distractions in the cockpit are different--complexity of the pilot/aircraft interface (just with the HOTAS of an F-22, there are some 20+ buttons and switches, most of which are multi-axis operable, and one pilot with whom I worked referred to the much simpler F-16 HOTAS as a piccolo dance) and the fear induced by a badly misbehaving aircraft just to name a couple--but no less real, and they put a very large premium on training and experience.

Eric Hines

Texan99 said...

I don't know when I've read a set of more interesting comments.

Robert Macaulay said...

Mr. Hines, you spent the time to much more thoroughly elucidate the point I was trying to make quickly. Thanks.

Tex, I agree. One reason I lurk here is the comments section. I also enjoy the occasional reference to Kipling.