
A Deeper Observation of Pilots

In my past several articles, I have described the pilot observations I have made and the categories of pilot performance I identified. After identifying four broad pilot categories, I began to realize that some traits are present across categories: sub-groups that were not tied to any single broad category and could show up in any of them. The broad categories were:

1)     The Information Managers,

2)     The Nonassertive Decision-Makers,

3)     The Snowballers, and

4)     The Lost in Space.

(See previous iPilot articles for more information).

I further identified and named two such sub-groups: The Illogical Decision-Makers and the Good Decision Makers/Poor Fliers.

Illogical Decision-Makers (subgroup)

When faced with a decision, many pilots make an irrational or illogical choice. When faced with an alternator failure, for instance, and with limited time available for a safe landing, one pilot requested a holding pattern. The participants who showed traits from this sub-group most often also displayed characteristics in common with either the Snowball Effect group or the Nonassertive Decision-Makers. I therefore consider the Illogical Decision-Makers a subgroup, because their illogical solutions to problems may have stemmed from a lack of assertiveness or from workload saturation.

The pilots who were placed into this subgroup tended to take one of two courses of action: either they did something good for the wrong reason, or they did the wrong thing for an illogical reason.

Good Choices for the Wrong Reason (dumb luck)

In one scenario, a pilot, upon hearing that the clouds at the destination airport were at 300 feet, declared that the approach could not be flown and wanted to fly to an alternate airport. That particular ILS approach allowed the airplane to be flown to within 200 feet of the ground, which would have been below the clouds. The best choice, given the circumstances at that moment, would have been to fly the ILS approach. But that particular scenario had another problem embedded in it. Had the pilot made the best choice (shoot the ILS), I was going to give them a glide slope failure, and that would have changed the choices again. In that situation, a diversion to the alternate would become the best choice. So this pilot did do what eventually would have proven best, but arrived at the decision before acquiring the information that would have labeled it as such. The pilot selected the best choice through dumb luck.

Another example of a participant making a good choice for the wrong reason came about when one pilot declared a missed approach at the first sign that there was a problem. He said, 'I was trained to go to the alternate if anything ever failed.' This impulsive reaction defies the logical question: Why would the airplane's equipment failure (and your performance with it) be any better at the alternate? This reaction did not take into consideration any of the situation-specific circumstances. In this case the participant did save time by making a missed approach early, but a one-size-fits-all solution would not work in every situation.

Poor Choice because of the Wrong Reason

The remaining participants within this subgroup were characterized by decisions that placed them in additional jeopardy due to illogical judgments. Several of the participants would fly one approach procedure and see for themselves that a safe landing from that approach was not possible because the clouds were too low. Then, with time running out, they elected to fly the same approach again, with the same cloud base reported. The time it took to fly the approach a second and sometimes a third time, while fuel ran down, eliminated all the other options they would have had if they had acted sooner.

Illogical decisions were sometimes made simply because the pilot was overworked and could not think of anything else to do. The solutions to in-flight problems require some thought and even creativity, but many pilots are so mentally taxed by the demands of flying they do not have time to give the problem any thought. Without using deliberate thought, they react with an impulsive action. The product of a 'no-thought' action is an illogical action. (This is why the Illogical Decision-Makers were considered a subgroup of the Snowball Effect category.)

Other Pitfalls: It was also possible that a participant elected to take a no-win course of action because they were simply too timid to work out any other course of action with the controller. Many pilots showed a reluctance to converse with the controller and did not realize that their own ideas for solving the problem were valid. They did not seem to realize that they could be, and should be, in control of the situation. (For this reason the Illogical Decision-Makers were also considered a subgroup of the Nonassertive Decision-Makers category.)

What I Heard

The conversations these pilots had with controllers became a 'window' into their faulty decision making. Here are some actual quotes:

'I'm going around off the approach.' (This comment was made at a point when the approach had been going well and a landing was a real possibility; bailing out of the approach at that point was illogical.)

'Nashville (approach), I am requesting an ASR (Airport Surveillance Radar) approach.' (The participant had just made a missed approach on a procedure that allowed the airplane to descend to within 400 feet of the ground. The ASR approach would allow the airplane to descend only to 600 feet.)

'I was close to the airport so my alternator light is not urgent.'

'The weather is below localizer minimums. If I miss this approach, I will return for another at Smyrna (destination airport).'

What I Wrote

My observation journals again turned up surprising trends. Of the pilots I observed, I wrote:

'The decision-making process seems to be independent of how current an instrument pilot is. Some of our pilots made good decisions yet flew the flight simulator poorly; others flew well but made poor decisions.'

'There seems to be a universal fear of seeking help or declaring an emergency. They risk their lives rather than writing a letter or making a phone call.'

Important: When a pilot declares an emergency, the FAA will sometimes ask the pilot to explain what happened in a letter or a telephone interview, but my investigation determined that this is very rare. Enforcement action against a pilot who declared an emergency almost never happens, and when it does, the action has nothing to do with the emergency itself.

Next week the last of the sub-groups ... and the scariest!

About This Author:
Paul A. Craig is a Gold Seal Multiengine and Instrument Flight Instructor. He currently holds a total of 11 Flight Certificates including his ATP. Craig is a previous winner of the North Carolina and Tennessee Flight Instructor of the Year award, the NCVT Outstanding Teacher award and has served as the regional representative of the National Air and Space Museum. Craig is an FAA Aviation Safety Counselor and the author of eight books, including Pilot In Command, The Killing Zone: How and Why Pilots Die, and Controlling Pilot Error: Situational Awareness (all from McGraw Hill).