It is always pilot error
The aviation community has been buzzing with speculation and commentary around the recent Boeing 737 MAX 8 crash in Ethiopia and the model’s subsequent grounding around the world. Watching this news report I was struck by the following quote from the “Deputy Assistant Secretary of State” regarding a similar crash in Indonesia:
“The plane should have never gone down if the pilots had just been following procedures”
Fundamentally, as pilots in command we take full and complete responsibility for the fate of our aircraft and passengers every time we start the engines. So yes, while this statement may be correct by the book, it is still a completely irresponsible perspective on aviation safety.
Building any system in which a single human is solely responsible for its safe operation would be wildly reckless. Humans make so many mistakes, many of them entirely predictable and avoidable with the appropriate checks, procedures, and overlapping fail-safes.
The only crash in my lifetime for which I could argue the cause was purely “pilot error” is Germanwings Flight 9525, where the co-pilot deliberately flew the aircraft into a mountain. Even then, systems of safety might have prevented the crash, such as the standard procedure followed by US airlines which requires two individuals to be in the cockpit at all times.
I think about this topic a lot when it comes to the software that we build. It is more and more difficult for me to call something “user error” these days, since the problem at hand is often easily preventable by improved safeguards, clearer interfaces, and sane defaults. A perspective shared by Jordan Sissel comes to mind:
“If a new user has a problem, it’s a bug in the code or the documentation”
These two lines of thought interleave now that early indications point towards software failures contributing to the crashes of these two aircraft and the deaths of hundreds of passengers. Perhaps it was a single faulty sensor, or an incorrect automated overcorrection, or the pilot not being made aware of some condition and choosing the wrong responsive action. Ultimately, yes, we can make the claim that every crash is the pilot’s fault, but that does not release the manufacturer and the engineers of the various systems from blame for failing to build safer systems, ones which ensure that the lives of hundreds do not depend on a single human decision.