Responsibility and Accountability

These terms get bandied about frequently these days, but their actual meaning has almost been lost through our tendency to treat them as if they were interchangeable.

I like to understand responsibility as both a management and a social contract: we can hold another person responsible for actions, a process, or a result, and that is the most fundamental building block of all human organisations.  It is also a basic component of all relationships, because we form mutual agreements with other people, and keeping or breaking one’s commitments is how others gauge our reliability.  As we keep our agreements consistently, others develop trust in us and a measure of confidence in our abilities.  Responsibility is external to the individual; their personal feelings on the validity of the onus are largely irrelevant.

Accountability, on the other hand, is best understood as one’s own feeling of ownership: the tasks, roles, and outcomes that you feel it is your duty to look after.  And like all feelings, it is transient and can fluctuate from day to day.

It’s possible, of course, to be held responsible for things one doesn’t feel accountable for.  F-16 pilot Major Harry Schmidt, for example, was held responsible for the deaths of Canadian soldiers at Tarnak Farm after he mistook their training exercise for enemy fire and released a weapon on their position.  Major Schmidt subsequently fought a protracted but unsuccessful battle to save his military flying career and to keep the details of his reprimand under wraps.  Schmidt is, I am sure, deeply affected by the knowledge that he killed friendly comrades in arms; but his public actions tell us that perhaps he doesn’t think the sudden end of his military flying days is a logical and fitting consequence of his error.

Humans by their very nature are imperfect beings; we will all make at least one error (if not several) per calendar day.  If we are disciplined, focused and fortunate, they will be minor and inconsequential lapses which will not have any long-term negative impact.  If we are at times less fortunate, disciplined, and attentive—or some disastrous combination of all three—a serious error can result in major personal or professional consequences.  But our nature—not always rational—also means that we do not like to accept responsibility for our failures, and rationalising our own purported blamelessness is something every human being on the planet has some experience with.

Bruce Landsberg, writing for the AOPA’s Air Safety Foundation eJournal, notes that pilots for mass commercial air carriers generally take their responsibilities and accountabilities quite seriously, but even with that high degree of skill, experience, and training, there can be regrettable (and entirely avoidable) blunders.  Mr. Landsberg laments the lack of accountability (which I understand to be the sense of ownership of the problem), pointing out that even in such a demanding and highly skilled fraternity, it can be tempting to spread the blame to systemic or procedural faults when the obvious proximate cause is the judgment of the guy or girl in charge.

The laptop lapse in the Airbus that overflew Minneapolis was irritating. A quote from a well-known captain in the NY Times: “Something in the system allowed these well-trained, experienced, well-meaning, well-intentioned pilots not to notice where they were, and we need to find out what the root causes are,” he said. “Simply to blame individual practitioners is wrong and it doesn’t solve the underlying issues or prevent it from happening.”

How is it the system’s fault when two professional pilots in a perfectly functioning aircraft manage to forget that they are flying eastbound at over 400 knots and should be landing soon? When do individual practitioners, who are placed in positions of absolute authority (and there are two of them, to be sure that they are looking out for each other), come to be accountable?

— Landsberg, Bruce.  “Not my Fault, Mon!”  AOPA ASF Blog, 19 May 2010.

This should interest general aviation pilots as well, because we operate less complex aircraft and often lack a copilot (or second set of eyes) as backup should we suffer a lapse in our own judgement while flying.  The safety of the flight thus depends on how well the “system” of us—the individual GA pilot—is operating.  On a bad day, that can lead to dangerous and potentially fatal consequences.

In light aircraft, with largely single-pilot operations, we don’t have as many opportunities to blame “the system,” except possibly ATC.  You ARE the system, and when there is a genuine systemic problem that wasn’t a self-inflicted wound, please file an ASRS report.  Even if it was your own doing, we can all learn from such incidents.

It’s said the road to Hell is paved with good intentions and best wishes.  Unlike felony law, where intention does make a difference, gravity and Newtonian physics make no distinctions: it’s all about avoiding the edges of the airspace and other aircraft.

As Mr. Landsberg notes, one of the best things a pilot can do after a “what the hell was I thinking” moment is to file an anonymous and consequence-free ASRS (Aviation Safety Reporting System) report on the close call, so that other pilots can read the particulars and hopefully avoid being trapped in a similar fatal decision cycle.  Perversely, the ASRS reports get spit out as Word, Excel, or CSV files, so they aren’t especially readable at the NASA/FAA site, but the website 37,000 Feet does a serviceable job of rendering them in an easily digestible web format.

Here’s one that all pilots (and heck, even ground-bound drivers) can relate to: a case of get-there-itis.  I’m sure everyone has driven into meteorological conditions where visibility is marginal at best and the safest course of action would be to pull over rather than continue; yet sometimes we press on nonetheless.  A Cessna Grand Caravan pilot with 3,300 hours under his belt was approaching the unlit Marsh Harbour Airport, Bahamas (MYAM) in fading twilight.  He had to go around twice after losing sight of the unlit runway, and eventually made a successful landing on his third attempt.  But as he later realised, in his intense focus on getting the job done, he had lost sight of the more important consideration—air safety.

At one time or another, every human alive will have—if we are honest with ourselves—made decisions inconsistent with our own “best practices”, personal values, or legal regulatory frameworks (the Highway Traffic Act, for example), and then thought better of it in a moment of sober second thought.  This is part of the burden of being a human being, imperfect by nature.  One practice we would all do well to adopt is to look first at the guy in the mirror before we start trying to slough off blame onto the people and systems around us.

RELATED: David Megginson of Land and Hold Short links to a truly astounding incident in which a pilot who really ought to have known better tried to take off from Brantford, Ontario in his light twin (with an inoperative right engine) and failed to achieve adequate obstacle clearance.  Having an engine fail on takeoff is something every pilot trains and prepares for, but knowing that an engine is INOP before starting your takeoff roll—and deciding to go anyway—is dispensing with caution perhaps a little too freely.
