It isn’t often you get to experience and walk away from a CFIT mishap, especially in a high-performance fighter.
All right, I’m impressed.
A brand new 747-8 performs an RTO (rejected takeoff) test at MTOW (maximum takeoff weight) using fully worn brakes, without reverse thrust, and still manages to stop 700 feet earlier than projected.
Read more about it at the manufacturer’s website.
Time for a palate cleanser. Videographer Jordi Blumberg has filmed many terrific sequences of aircraft and airside operations around London Gatwick Airport; here is one featuring a runway check.
Airport operators are required to check surface movement areas (aprons, taxiways and runways) for FOD (Foreign Object Debris), which can cause critical damage to an aircraft’s engines. In practice this means that every hour, an airport utility vehicle goes bolting down the runway (with anti-collision lights flashing), looking for dangerous bits of metal that might have fallen off other aircraft and could damage the next departing or arriving plane.
A Qantas aircrew (and passengers) rediscover the old axiom that it’s better to be lucky than good. Though it helps if you have an ample supply of both.
LEAKING water knocked out electricity to a number of systems during a Qantas 747’s flight to Bangkok, forcing the crew to land using limited battery power in a race against the clock.
…As a result of the leak many of the aircraft’s communication, navigation, monitoring and warning, and flight guidance systems were affected.
Had the event occurred more than 30 minutes flying time from the nearest suitable airport, or if there had been a delay prior to landing, numerous flight-critical systems would have become unavailable, placing the flight at “considerable” risk, air safety investigators warned.
— Schneider, Kate. “Qantas jet’s ‘lucky escape’ after water leak.” News.com.au, 13 December 2010.
The plane had a 30-minute battery reserve powering the avionics bay; the crew landed having used 21 of those minutes, leaving a nine-minute margin.
Responsibility and accountability are terms that get bandied about frequently these days, but their actual meanings have almost been lost through our tendency to treat them as interchangeable synonyms.
I like to understand responsibility as both a management and social contract; we can hold another person to account for actions, a process, or a result, and that is the most fundamental building block of all human organisations. It is also a basic component of all relationships, because we form mutual agreements with other people, and making or breaking one’s commitments is how others gauge our reliability. As we keep our agreements consistently, others develop trust in us, and a measure of confidence in our abilities. Responsibility is external to the individual; their personal feelings on the validity of the onus are largely irrelevant.
Accountability, on the other hand, is best understood as one’s own feeling of ownership: the tasks, roles, and outcomes that you feel it is your duty to look after. And like all feelings it is transient, fluctuating from day to day.
It’s possible, of course, to be held responsible for things one doesn’t feel accountable for. F-16 pilot Major Harry Schmidt, for example, was held responsible for the deaths of Canadian soldiers at Tarnak Farm after he mistook their training exercise for enemy fire and released a weapon on their position. Major Schmidt subsequently fought a protracted but unsuccessful battle to save his military flying career and keep details of his reprimand under wraps. Schmidt is, I am sure, deeply affected by the knowledge that he killed friendly comrades in arms; but his public actions tell us that perhaps he doesn’t think the sudden end of his military flying days is a logical and fitting consequence of his error.
Humans by their very nature are imperfect beings; we will all make at least one error (if not several) per calendar day. If we are disciplined, focused and fortunate, they will be minor and inconsequential lapses which will not have any long-term negative impact. If we are at times less fortunate, disciplined, and attentive—or some disastrous combination of all three—a serious error can result in major personal or professional consequences. But our nature—not always rational—also means that we do not like to accept responsibility for our failures, and rationalising our own purported blamelessness is something every human being on the planet has some experience with.
Bruce Landsberg, writing for the AOPA’s Air Safety Foundation eJournal, notes that pilots for mass commercial air carriers generally take their responsibilities and accountabilities quite seriously, but even with that high degree of skill, experience and training there can be regrettable (and entirely avoidable) blunders. Mr. Landsberg laments the lack of accountability (which I understand to be the sense of ownership of the problem), pointing out that even in such a demanding and highly-skilled fraternity, it can be tempting to spread the blame to systemic or procedural faults when the obvious proximate cause is the judgment of the guy or girl in charge.
The laptop lapse in the Airbus that overflew Minneapolis was irritating. A quote from a well-known captain in the NY Times: “Something in the system allowed these well-trained, experienced, well-meaning, well-intentioned pilots not to notice where they were, and we need to find out what the root causes are,” he said. “Simply to blame individual practitioners is wrong and it doesn’t solve the underlying issues or prevent it from happening.”
How is it the system’s fault when two professional pilots in a perfectly functioning aircraft manage to forget that they are flying eastbound at over 400 knots and should be landing soon? When do individual practitioners who are placed in position of absolute authority and there are two of them to be sure that they are looking out for each other, come to be accountable?
— Landsberg, Bruce. “Not my Fault, Mon!” AOPA ASF Blog, 19 May 2010.
These things should interest general aviation pilots as well, because we operate less complex aircraft and often lack a copilot (or second set of eyes) as backup, should we suffer a lapse in our own judgement while flying. The safety of the flight thus depends on how well the “system” of us—the individual GA pilot—is operating. On a bad day, it may lead to dangerous and potentially fatal consequences.
In light aircraft with largely single pilot operations, we don’t have as many opportunities to blame “the system” except possibly ATC. You ARE the system and when there is a systemic problem and it wasn’t a self-inflicted wound, please file an ASRS report. Even if it was your own doing – we can all learn from such incidents.
It’s said the road to Hell is paved with good intentions and best wishes. Unlike felony law where intention does make a difference, gravity and Newtonian physics make no distinctions – it’s all about avoiding the edges of the airspace and other aircraft.
As Mr. Landsberg notes, one of the best things a pilot can do, after they have had a “what the hell was I thinking” moment, is to file an anonymous and consequence-free ASRS (Aviation Safety Reporting System) report on their own close-call incidents, so that other pilots can read the particulars and hopefully avoid being trapped in a fatal decision cycle. Perversely, the ASRS reports get spit out as Word, Excel or CSV files, so they aren’t especially good to read at the NASA/FAA site, but the website 37,000 Feet does a serviceable job of rendering them in easily digestible web format.
Here’s one that all pilots (and heck, even ground-bound drivers) can relate to: a case of get-there-itis. I’m sure everyone has driven into meteorological conditions where visibility is marginal at best, and the safest course of action would be to pull over and not continue; yet sometimes we press on nonetheless. A Cessna Grand Caravan pilot with 3,300 hours under his belt was approaching the unlit Marsh Harbour Airport, Bahamas (MYAM) in fading twilight conditions. He went around twice after losing sight of the unlit runway, eventually landing successfully on his third attempt. But as he later realised, because of his intense focus on getting the job done, he lost sight of the more important consideration—air safety.
At one time or another every human alive will have—if we are honest with ourselves—made decisions inconsistent with our own “best practices”, personal values, or legal regulatory frameworks (the Highway Traffic Act, for example), and then thought better of it in a moment of sober second thought. This is part of the burden of being a human being, imperfect by nature. One practice we would all do well to adopt is to look first at the guy in the mirror, before we start trying to slough off blame on the people and systems around us.
RELATED: David Megginson of Land and Hold Short links to a truly astounding incident where a pilot who really ought to have known better tried to take off from Brantford, Ontario in his light twin (with an inoperative right engine) and failed to make adequate obstacle clearance. Having an engine fail on takeoff is something that every pilot trains and prepares for, but knowing that an engine is INOP prior to starting your takeoff roll—and deciding to go anyway—is dispensing with caution perhaps a little too freely.
Mr. Bruce Landsberg (President of the AOPA Air Safety Foundation) writes at the AOPA ASF Blog, and decries recent calls for general aviation to adopt higher standards and a more rigorous regulatory regime akin to that of mass-market commercial carriers.
This is an area where one’s pilot rating will tend to determine how one views the issue; ATP-rated pilots will generally view greater rigour as no big deal and an essentially good idea, while non-ATP-rated folks are likely to view it as straining at gnats, where one will soon encounter the law of diminishing returns.
My own sense is that subjecting GA pilots to commercial-carrier-type regulation would be akin to requiring ordinary drivers to conform to the mechanical, technical and certification regulations governing buses and commercial transport trucks. It might in the end make the roads a little safer, but would the added inconvenience and expense be a worthwhile trade-off?
Speaking from my own experience, I’ve flown with very meticulous, by-the-book GA pilots who (for example) always check the weather forecasts; always perform a precisely detailed preflight and walk-around; always give passengers a detailed safety briefing including instructions for use of the ELT; always perform the placarded checklists at the appropriate points, and so on. I’ve also flown with GA pilots who omitted one or two (or more) of those steps, and who I probably wouldn’t trust to take a car to the corner store.
I’ve only been in two situations where I felt a GA pilot endangered my life. In both of those cases, I’m not sure additional regulation would have helped, since they stemmed from an experienced pilot’s failure to recognise the increased risk arising from specific flight conditions; conditions they are already educated about and taught to avoid! One was a definite case of get-there-itis: a multiengine- and instrument-rated instructor pilot (who should have known better) continued flight into deteriorating winter weather conditions and a geographic locality (the middle of Lake Michigan) which provided no forced landing alternatives should an engine quit or otherwise force us down. To make things worse, the freezing layer was so low we were down to a few hundred feet above the wavetops; if we had to ditch, we would barely have had time to broadcast a position report before we got dunked in the freezing water.
Part of pilot training is giving the individual tools to make good judgments under a wide array of situations, but no educational method can guarantee a human will always be able to identify the relevant data points to arrive at the right conclusion.
This Strike Eagle maintenance crew chief just saved one of his birds from a potential mishap on takeoff:
“We were watching them taxi and I noticed when the second jet made its turn the left rudder was fully deflected to the right and the right rudder was perfectly straight after it made the turn,” said Sergeant [Justin] Wilson, deployed from Royal Air Force Lakenheath, England. “I knew that the pilots were not actually making this happen and something must be wrong.”
…After seeing the potentially broken jet in the end-of-runway area receiving final preparations for takeoff, Sergeant Wilson ran to the area to inform the crew of the problem and advise them to send the jet back to its parking area.
— Williams, Richard (Staff Sgt, USAF). “Airman’s vigilance prevents aircraft mishap.” 455th Air Expeditionary Wing Public Affairs, 19 March 2010.
I’m sure Sgt. Wilson got a beer or two from a grateful aircrew for his alertness and quick action.
Every person will, at some point, encounter an extraordinary situation in which regulations or prior training will incline them to take one course of action, but the specifics of the scenario will lead their instinct to override it and choose another. Most of us will not be placed in a situation where that call is time-critical and the course of hundreds of lives will depend on the outcome.
On January 17th, 2008, the flight crew of ill-fated Speedbird 38 (BA038) made a last-minute adjustment to their flap settings, opting to extend their touchdown zone rather than have the guts ripped out of their crippled steed by Runway 27L’s localizer array and approach lighting.
Captain Peter Burkill altered the flap settings to reduce drag when the Boeing Co. 777 was only 240 feet above the ground, the U.K. Air Accidents Investigation Branch said in a report today. That delayed the impact for 50 meters (164 feet) and the plane came down on a grass apron with no fatalities.
The Boeing cracked a wing and had its wheels ripped off in the crash on Jan. 17, 2008, after frozen fuel lines stopped its engines from providing sufficient thrust as it neared Heathrow. Had the pilot not adjusted the flaps the 777 would have plowed into a cluster of antennas that communicate with the instrument landing systems of aircraft before touchdown, the AAIB said.
…“You have to take your hat off to Captain Burkill because while reducing the amount of flap helps maintain speed it also diminishes lift and it’s something you never, ever do,” said Kieran Daly, an air-safety commentator and former pilot. “So really it’s an extraordinary thing. An act of genius.”
— Prione, Sabine. “British Airways Pilot Averted Worse Crash, Study Says.” Bloomberg BusinessWeek, 9 February 2010.
Despite that good decision, the award of the BA Safety Medal (only awarded three times previously), and a later return to flying duties, Captain Burkill took voluntary redundancy and left British Airways in 2009.
Glossed over in the report was the fact that both the captain and first officer had very little sleep over the previous 24 hours. The NTSB says the captain had ‘reduced sleep opportunities’ and attempted to rest in the company crew lounge. Apparently the attempts at sleeping there weren’t effective since the captain logged on to a company computer at 3:10 in the morning.
…But one of the investigators in the Colgan accident, Robert Sumwalt refuses to allow for the possibility that fatigue was even a contributing factor in the accident, saying “…just because the crew was fatigued, that doesn’t mean it was a factor in their performance.”
…The role of fatigue was mentioned during an NTSB hearing on the Colgan accident. Board chairman Deborah Hersman argued that several issues, including the crew’s sleep deficits and the time of day the accident took place, were factors and said that fatigue was present and should be counted as a contributing factor to the crew’s performance.
But the view of board member and former US Airways pilot Robert Sumwalt prevailed. He concluded that fatigue wasn’t a factor in the accident. That didn’t stop the board from detailing the role it played in Colgan 3407 (PDF LINK).
So if nicotine is found to cause some cancer, but its role in a person’s life expectancy cannot be determined, should we rule it out as a possible factor in a lung cancer death?
— Wien, Kent. “Plane Answers: NTSB glosses over fatigue in the Colgan crash.” Gadling.com, 4 February 2010.
RELATED: Kent argues convincingly that the Colgan crew was not distracted by idle chatter, since they didn’t say anything other than the usual callouts for two minutes prior to the stall condition.
As the video says, the airport infrastructure includes such on-field hazards as animals, people, an uneven runway surface, nearby houses and rapidly variable weather conditions.