Doug Campbell (RiP) Did the PFD GMHS Know or Ignore his Tried-and-True Campbell Prediction System? 3
This preferred title replaces the version shortened due to Wix website size constraints: "Doug Campbell (RiP) Did the Prescott FD Granite Mountain Hot Shots (GMHS) Even Know, Train In, or Ignore his Tried-and-True Campbell Prediction System leading up to and including their detriment on June 30, 2013? Part 3 of 4"
Views expressed to "the public at large" and "of public concern"
DISCLAIMER: Please fully read the front page of the website (link below) before reading any of the posts ( www.yarnellhillfirerevelations.com )
The authors and the blog are not responsible for misuse, reuse, or recycled, cited, and/or uncited copies of content within this blog by others. Even though we are presenting this content publicly, anyone wishing to reuse it must first obtain written permission because it contains copyrighted material. Thank you.
Abbreviations used below: Wildland Firefighters (WFs) - Firefighters (FFs).
The author took the liberty of correcting some of the minor spelling, punctuation, and grammar errors in the sources provided; and also provided select links and/or hyperlinks as well throughout this post to the best of the author's abilities. They will show up in an underlined, somewhat faded appearance. Clicking on the link twice will reveal a blue link which will take you to the source after clicking on it again.
Carried over from Part One and Part Two - Doug Campbell passed away in Ojai, CA on July 13, 2021. This is a tribute to our self-avowed Politically Incorrect Hot Shot Friend and Brother, Mentor, Leader, Visionary, and Creator of the Campbell Prediction System (CPS). Doug Campbell (RiP) was truly a remarkable man with a wide range of wildland fire interests in fire behavior, leadership, and human factors; enhanced by always being grounded by family, friends, loved ones, and colleagues. Whatever he did, he did it with eagerness, enthusiasm, and enjoyment. He was always respectful and always logical. He was at ease "speaking truth to power" for the benefit of all WFs and FFs.
He taught us to think of predicted fire behavior intuitively, in terms of logic. Doug was truly blessed with an incredibly brilliant mind, equipped to reach the highest intellect, yet still able to identify with and relate to the simple aspects of reading a wildfire's signature to discern what it was telling us. He died much too soon. However, many of us were blessed to have attended his lectures, and to have read, researched, and then applied the works that would eventually become the Campbell Prediction System (CPS). We promise to pass this "Old School" work on to others. Thank you. We will miss you.
No king is saved by the multitude of an army;
A mighty man is not delivered by great strength.
A horse is a vain hope for safety;
Neither shall it deliver any by its great strength. Psalm 33:16-17 (NKJV)
"Ninety percent of success can be boiled down to consistently doing the obvious thing for an uncommonly long period of time without convincing yourself that you're smarter than you are." Farnam Street Blog (8/8/21) "Learn faster, think better, and make smart decisions - Wisdom You Can Use"
Figure 1. Doug Campbell and CPS images. Source: Wildfire Management Tool WWEC website
Carried over from CPS post Part Two within the CPS documents - USFS Project Leader Dr. Ted Putnam's Findings From the Wildland Firefighters Human Factors Workshop - Improving Wildland Firefighter Performance Under Stressful, Risky Conditions: Toward Better Decisions on the Fireline and More Resilient Organizations (Table of Contents)
Consider now the works of several recognized "experts" in human factors from Appendix D - Keynote Presentations: (1) Dr. Curt C. Braun, Department of Psychology, University of Idaho, addressing the Behavioral Bases of Accidents and Incidents: Identifying the Common Elements in Accidents and Incidents; (2) Dr. Gary Klein of MacroCognition LLC, addressing Recognition-Primed Decision Strategies; (3) David O. Hart, TID, Inc., addressing Cultural Attitudes and Change in High-Stress, High-Speed Teams; and (4) Dr. Karl Weick, University of Michigan, addressing South Canyon Revisited: Lessons from High Reliability Organizations.
First off, as posted elsewhere on the YHFR website, Dr. Braun was our Human Factors expert on the "Hochderferrer Fire Shelter Deployment Investigation" on the Coconino NF in 1996. Although spelled incorrectly on the Wildland Fire LLC web search result (HOCHDERFTER FIRE SHELTER DEPLOYMENT), the report and associated documents for the fire are included as a link in the fire name above. This link with further information is from USFS Fire Analyst Rick Stratton's Wildland Fire Library website, which describes it as an "event."
Note the letter content and the handwritten comment on the cover page about "the releasable portion/part of the report."
This author was the Operations Specialist, and Dr. Braun's comment was part of our initial human factors briefing: "The first thing we're going to do is establish a conclusion, then find the facts to support it." This author questioned that statement, countering that the facts should come first because they are supposed to lead us to a conclusion. Braun insisted that the conclusion would come first. "Then we can write anything we want" was this author's response. The statements in Dr. Braun's presentation below belie his conclusion-first principle.
Notwithstanding the above comments, consider his 2001 research with others of a similar ilk, in which they basically undermine the collective intelligence of WFs and FFs by seeking an alternative to the Fire Orders (Creating and Evaluating Alternatives to the 10 Standard Fire Orders and 18 Watch-Out Situations, International Journal of Cognitive Ergonomics, 2001). And this is supposedly because of what? Simply far too many items to remember? Even so, Dr. Braun has contributed some valid points in his presentation below. So then, this question needs to be broached: How did all of you make it through high school and college without memorizing 43 key principles that would save your lives? And how about all of you sports aficionados who can memorize hundreds of players' and teams' statistics?
(Heading and section emphasis are original. All other emphasis is added below. The word decision-making is unhyphenated in the original and now hyphenated throughout this YHFR post, to avoid the annoying spellcheck reminders to hyphenate the word)
Behavioral Bases of Accidents and Incidents: Identifying the Common Elements in Accidents and Incidents
Dr. Curt Braun - "Virtually every college student has faced the philosophical question, "If a tree falls in the woods and no one is there to hear it, does it make a sound?" The answer of course is no; the falling tree does not make a sound. While many people struggle with this answer, it is important to remember that the answer relies, not on the physics associated with a falling tree, but rather on the definition of sound. Sound is a subjective sensation created when the ear is stimulated by changes in the surrounding air pressure. Given this definition, a tree falling in the woods makes no sound when an ear is not present. A comparable safety question might be, "If there is a snag in the woods and there is no one there, does it pose a risk?" Again, the answer would be no. As with the sound example, the answer centers not on the physics of a falling tree, but rather on the definition of risk, a chance of loss or injury to a human. In the absence of a human, a falling snag creates no threat of injury or loss. Although this relationship appears obvious, it is important to realize that there are two components to this question: the snag, and the presence or absence of the human. Both play a role in creating a risky situation.
Here are a few sources countering the "if a tree fell in the forest" question for your consideration and perusal if you're interested in this ongoing philosophical, scientific, hypothetical argument - for hundreds of years.
February 28, 2018. Written by: Michael Bahtiarian. If a Tree Falls in a Forest: A Yes and No Answer? Acentech
K. Cornille. Apr 1, 2018, (link in author's name and date) If a Tree Falls in the Woods. It’s okay to not know what happens. Medium.com
Jim Baggott. February 14, 2011. Quantum Theory: If a tree falls in the forest… Oxford University Press
Numerous commenters. ETHICAL CONUNDRUMS. The Guardian UK: If a tree falls in the forest and no one is there, does it still make a sound?
If an individual is injured by a falling snag, clearly both had to be present. This situation can easily be represented by the following model:
Environmental Hazard (Snag) + Human = Accident
The role of the snag and the individual in this situation are (sic) significantly different. The fact that the snag will eventually fall is well known and in contrast to the actions of the human, represents a relative constant. We know that the snag will eventually fall, but not when. If the environmental hazard remains essentially constant, only one component is left to vary: the actions of the human.
The level of risk created by the snag can be mitigated or exacerbated by the behavior of the individual. Injury and loss are more likely when the individual fails to attend to the known risks. When the individual is struck by the falling snag, the proximate cause is apparent, inattentiveness. It is not apparent, however, that this was an isolated case of inattentiveness. This inattentiveness might represent a general pattern of behavior that places the individual at risk in a variety of situations. To adequately respond to the accident, consideration must be given to both the proximate cause and the behavioral pattern. Unfortunately, traditional safety programs have placed far more emphasis on the former than on the latter.
Human Behavior and Accidents
Few will argue that most accidents and mishaps are directly related to unsafe behaviors. A review of the national air traffic control system revealed that 90% of the committed errors could be directly linked to human inattentiveness, poor judgment, or poor communications (Danaher, 1980). Mansdorf (1993) lists nine different causes of accidents and attributes all of them to human error in the form of inadequate training, supervision, and management. Given this consensus, the solution is simple; change the behavior where the accidents occur. Despite the intuitive appeal of this approach, efforts to increase safety in this manner often fail to produce the anticipated reductions in accidents. These failures occur because traditional safety programs generally focus on the unique circumstances and risks that, like the snag, remain relatively constant. Moreover, these programs often do not consider the broad spectrum of situations where the same behavior can also result in an accident.
A Training Proposal. By Wildland Fire Specialists. How to Base Actions on the Predicted Fire Behavior. We know the rule well but… Do we know how to be sure we can comply? All firefighters should be able to predict the changes in fire behavior. This is the course focusing on that objective. (Nov. 18, 2011)
Doug Campbell (RiP), Slide Serve, and Libitha. Excellent PowerPoint
Krause and Russell (1994) suggest that accidents result, not from unique circumstances or behaviors, but from the intentional display of risky common practice. (sic) These authors contend that an accident represents an unexpected result of an unsafe act that has become part of the working culture. Despite the best efforts to mandate safety, risky behaviors increasingly become acceptable practice each time they are performed without negative consequences. The process is similar to that seen in individuals who interact with hazardous products. Safety researchers have found an inverse relationship between safety behavior and familiarity (Goldhaber & deTurck, 1988). The probability that an individual will comply with safety guidelines decreases as familiarity with the product increases.
Wildland firefighters are not immune to this process. In response to the South Canyon fire of 1994, Rhoades (1994) writes, "And sometimes, even often, the risks we take in doing our jobs include violating the 10 Standard Fire Fighting Orders or ignoring the 18 Situations that Shout Watch Out." He further writes, "Nonetheless, very seldom does our inability to comply with the orders cause us to abandon our tasks..." Rhoades' statements reflect the fact that it is possible to violate standard safety practices without the worry of negative consequences. More importantly, however, Rhoades' comments suggest that the violations have occurred with such great regularity that they have become accepted practice in wildfire suppression.
Accident Prevention From a Behavioral Perspective
An effective prevention program begins by understanding that accidents often reflect the unfortunate outcome of hazardous acts that have become common practices and that these practices frequently span a multitude of different job tasks. To be effective, a safety program must: 1) identify the antecedent behaviors that result in accidents and near-miss incidents; 2) determine how frequently these behaviors occur; 3) evaluate training and management programs; 4) provide consistent and active feedback and reinforcement, and 5) develop remediation plans.
Identifying Antecedent Behaviors.
Traditional accident investigations tend to be very myopic, focusing only on the circumstances immediately involved in the accident. The purpose of an investigation is to identify the accident's cause with the aim of creating new procedures, equipment, and standards to eliminate or at least minimize the risk (Mansdorf, 1993). This investigative approach, however, must go beyond the traditional microscopic analysis to identify behaviors that are common in a variety of accidents. To facilitate the identification of these behaviors, an investigation team should be composed of individuals from all levels of the workforce (Krause & Russell, 1994; Mansdorf, 1993). Moreover, efforts should be taken to reconstruct the accident with the aim of identifying the underlying behavioral patterns that might have precipitated it. Once identified, the investigation needs to assess the extent to which these behaviors have been present in other incidents or accidents. Finally, the investigation must assess the degree to which the actions reflect the acceptance of hazardous and risky behavior as common practice.
Assessing the Frequency. To assess the frequency of unsafe acts, a system for reporting accidents and near-miss incidents must be created. Near misses play an important role in assessing the frequency of risky acts. From the behavioral perspective, near misses represent accidents without the consequences (Krause & Russell, 1994). Moreover, given that unsafe behaviors infrequently result in accidents, near misses can provide better insight into employee safety. Mansdorf (1993) reports that for every serious industrial accident there are approximately 10 minor accidents, 30 property damage accidents, and 600 near-miss accidents.
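As an aside from this author (not part of Dr. Braun's presentation): Mansdorf's ratios amount to a simple scaling rule, and a minimal sketch can show what they imply for a reporting system. The numbers below are only the cited ratios; the season with two serious accidents is a hypothetical example.

```python
# Mansdorf's (1993) accident pyramid, as cited above: for every 1 serious
# accident, roughly 10 minor accidents, 30 property damage accidents, and
# 600 near-miss accidents. Ratios per single serious accident.
PYRAMID = {
    "serious": 1,
    "minor": 10,
    "property_damage": 30,
    "near_miss": 600,
}

def expected_events(serious_accidents: float) -> dict:
    """Scale the pyramid by an observed count of serious accidents."""
    return {tier: ratio * serious_accidents for tier, ratio in PYRAMID.items()}

# Hypothetical example: a season with 2 serious accidents implies on the
# order of 1,200 near misses that a good reporting system should capture.
estimate = expected_events(2)
print(estimate["near_miss"])        # 1200
print(estimate["property_damage"])  # 60
```

The point of the sketch is the one Braun makes in prose: near misses are by far the most abundant data, so a program that only counts serious accidents is ignoring roughly 600 out of every 641 observable safety events.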
The overarching motivation driving a reporting system should be the acquisition of reliable and valid data. To facilitate this process, the reporting system must encourage reporting from all levels of the workforce. Moreover, individuals should be instructed as to their reporting responsibilities. With regard to the logistics of the system, every reasonable effort should be taken to reduce the cost of complying with reporting requirements. These efforts might include simplifying reporting forms, the use [of] on-site or telephone-based interviewers to whom unsafe acts can be reported, the use of anonymous data collection systems, the creation of safety surveys, the use of trained field observers, or the use of automated data collection systems. Such reporting programs might also guarantee immunity from disciplinary actions for individuals who report.
Evaluating Training and Management. There are a variety of questions that must be asked when evaluating training and management. Are instances of the desired behavior demonstrated during training? For example, fire shelter training has traditionally placed more emphasis on getting into the shelter than on other factors such as situational awareness, site evaluation, ground preparation, and contingencies, all of which are essential to a successful shelter deployment. Are employees trained in the selection of the appropriate behavior? Invariably, more than one option is available for each situation. In a situation where a burnover is inevitable, a firefighter can deploy a fire shelter or attempt to escape. Factors that influence this decision-making process must be considered in advance. Training should include techniques and procedures used to evaluate the various options. Is there a system to continue training apart from the classroom? On-the-job training (OJT) is a widely used technique, but it suffers from many shortcomings. Trainers are frequently unaware of instructional techniques, training occurs only when time is made available, the situation typically dictates what skills are learned, and trainees often take a passive role, merely watching and not demonstrating behavior (Gordon, 1994). Managers and supervisors must assess the extent to which training relies on OJT and develop specific programs to maximize its usefulness.
After training, are the behaviors practiced? Just as firefighters exercise to maintain a level of physical fitness, skills learned in training must be practiced to ensure competency. In a recent article on decision-making in the fire environment, Braun and Latapie (1995) noted that training should include the rehearsal of behaviors that are needed in stressful conditions. Safety-critical behaviors must be practiced until they become automatic. Finally, what is the perceived priority of safety? Do supervisors and managers expect safe behaviors? Are firefighters asked to work in high-risk conditions that are outside of safe parameters? Is there an established code of conduct that specifies the safe behaviors an individual is expected to display? Is there an accountability system to which all firefighters are held? The answers to these and other questions provide an indication of the priority safety is given.
Feedback and Reinforcement.
The concepts of training and reinforcement are closely related. At its most basic level, training serves to educate an individual about the various reinforcement contingencies (Anderson, 1995). That is, during training an individual learns the actions and behaviors that will be reinforced when training is complete. After training is complete, are the trained behaviors expected and reinforced? Moreover, have the trained behaviors been directly or indirectly extinguished by example or directive? For example, are firefighters more often reinforced for taking risks than for demonstrating good judgment?
While it is important to assess if trained behaviors have been reinforced, it is just as important to determine if unsafe behaviors have been inappropriately reinforced by environmental events. Although the ultimate goal of firefighting is fire suppression, a suppressed fire is not an appropriate reinforcer for firefighting behavior. This unsuitability stems from the fact that all fires eventually go out independent of the actions taken by firefighters. This inevitability makes fire suppression an indiscriminate reinforcer. That is, fire suppression could reinforce both safe and unsafe behaviors. Some would agree that factors such as weather often play a larger role in suppression than firefighters, but still argue that firefighters should be reinforced by the fact that the size of the fire has been limited. There might be some truth in this statement, however, it is not completely verifiable because firefighters often take advantage of areas where the fire would stop on its own (e.g., natural fuel breaks).
Care should be taken in determining the types of reinforcement and feedback individuals obtain from the environment. The containment and suppression of fires, the saving of structures and resources, and other similar events make poor reinforcers because they are indiscriminate and because they target the outcome of behavior and not the behavior itself. Efforts must be made to reinforce the safe behaviors independent of the outcomes.
Remediation Plans. Shortcomings in training, supervision, or management should not be viewed in isolation but as representative of a company-wide pattern of behaviors. Efforts to remediate these shortcomings must endeavor to address both the specific behaviors and the broader culture. Each plan should identify short-term and long-term objectives and the criteria against which the plan will be evaluated.
Programs aimed at enhancing safety by addressing the proximate cause of an accident only consider a small portion of the safety picture. Merely addressing the proximate cause fails to consider that the system either directly or indirectly trains, reinforces, and even expects employees to demonstrate hazardous behavior. An effective safety program must consider both the proximate cause and the working environment that promotes hazardous behavior. The program must identify unsafe behaviors and assess their prevalence. It must evaluate training to ensure that individuals not only gain the necessary skills but are provided with opportunities to exercise and practice those skills. The safety program must survey supervisors and managers to determine if skills learned in training are actively reinforced, and finally, it must make recommendations that affect behaviors and the system that supports them.
References - included within the link above in order to save space
Gary Klein PhD Naturalistic Decision Making and Wildland Firefighting
Gary Klein, Ph.D., is a Senior Scientist at MacroCognition LLC. Dr. Klein received his Ph.D. in experimental psychology from the University of Pittsburgh in 1969. He spent the first phase of his career in academia as an Assistant Professor of Psychology at Oakland University (1970-1974). The second phase was spent working for the government as a research psychologist for the U.S. Air Force (1974-1978). The third phase began in 1978 when he founded his own R&D company, Klein Associates, which grew to 37 people by the time it was acquired by Applied Research Associates (ARA) in 2005. Dr. Klein has developed several models of cognitive processes: a Recognition-Primed Decision (RPD) model to describe how people actually make decisions in natural settings; a Data/Frame model of sensemaking; a Management by Discovery model of planning to handle wicked problems; and a Triple-Path model of insight. (MacroCognition LLC)
"The Recognition-Primed Decision Model [RPDM] describes what people actually do when they make difficult decisions. This has many implications for training and helping people make decisions under stressful situations. It can also help explain the factors behind bad decisions.
The standard method of decision-making is the rational choice model. Under this model, the decision-maker generates a range of options and a set of criteria for evaluating each option, assigns weights to the criteria, rates each option, and calculates which option is best. This is a general, comprehensive, and quantitative model which can be applied reliably to many situations. Unfortunately, this model is impractical. People making decisions under time pressure, such as firefighters, don't have the time or information to generate options and the criteria to rate each option.
The rational choice model is also too general. It fits each situation vaguely, but no situation exactly. The worst news is that in studies in which people have been asked to follow the rational choice model exactly, the decisions they come up with have been worse than decisions they make when they simply use their own experience base. This model is of little value to training because it does not apply to most naturalistic settings or to how people actually make decisions when faced with complex situations under time pressure. Decision aids which have been produced to assist with the application of the rational choice model have been largely ineffective. Because of these drawbacks, a field emerged called Naturalistic Decision Making (see Table 1 [Figure 2.]). This field emerged because governmental sponsors such as NASA, FAA, the military, and others, realized that they had spent a lot of money and built decision models that did not work in the field. They wanted to get away from building analytical models which didn't work when they were brought into action. Naturalistic Decision Making uses expert decision-makers, and tries to find out what they actually go through in their decision-making process.
Figure 2. Pros and cons bullet points discussed in his presentation. Source: Klein
Instead of restricting decision-making to the "moment of choice," experts are asked about planning, situational awareness, and problem solving to find out how these all fit together. This model is used to understand how people face decisions in shifting and unclear situations and under high stakes. Team interactions and organizational constraints with high stakes are also used as factors. For years, researchers had been simply asking college sophomores what they would do given a set of options and a clear goal. For Naturalistic Decision Making research, experts are asked to size up actual situations, using all cues and constraints to set goals and make decisions.
The first study I performed to generate models and training recommendations for decision-making under pressure and uncertainty was a study for the Army. The Army Research Institute wanted some data on decision-making in real, stressful situations, and I thought that urban firefighters would be a good example of people who had become experts at making such decisions. We studied commanders who had about 20 years of experience, and studied the most difficult cases they had. Of the cases we studied, there was an average of five changes in the fire and in the way it had to be handled. About 80 percent of the decisions were made in less than a minute. As we started the study, we found that each expert firefighter told us that they had never made any decisions. They explained that they simply followed procedures. But as we listened, we realized that in each case, there was one option which they thought of quickly. They evaluated that one option, and if it seemed viable, they went ahead with it.
We began to wonder how they came up with that first option and how they were able to evaluate one option without others for comparison. The strategy used by the firefighters is the basis for the Recognition-Primed Decision (RPD) Model (see Figure 1 [Figure 3]). The first level consists of a simple match, where decision-makers experience a situation and match it to a typical situation with which they already have experience. Because of this, they know what to expect. They know what's going to happen, they know what the relevant cues are, what the plausible goals are, and a typical action. They are able to do all of this because of their experience base. Experience buys them the ability to size up a situation and know what is going on and how to react. That's what decision researchers weren't learning when they studied college sophomores who didn't have an experience base.
Figure 3. [Klein Figure 1.] —Recognition-Primed Decision model. Source: Klein
An example of the first level of the RPD model is a firefighter I interviewed early in the process. He explained to me that he never made decisions. After trying to press him on the issue, I asked him to describe the last fire he was in. He told a story of a fairly conventional fire. He described parking the truck, getting out his hoses, and going into the house. I asked him why he went into the house instead of simply working from outside, as I would have been tempted to do. He explained that he obviously had to go in because if he attacked it from the outside, he would just spread it deeper inside the house. He took into account the nature of the fire, the distance of the house from other buildings, and the structure of the house. But, even while he was attending to these conditions, he never saw himself as making a decision. He never experienced that there was another option. He immediately saw what needed to be done and did it.
The second level of the model includes diagnosing the situation. On this level, expectancies are violated. The firefighter is trying to build a story to diagnose the event, and when evidence doesn't fit the story, the firefighter has to come up with a new scenario which fits the new evidence. There is still no comparing of options.
On the third level, decision-makers evaluate the course of action they have chosen. Originally, we weren't sure how people could evaluate a single option if they had no other options to compare it to. As we looked through the materials we were getting, we found that a decision-maker would evaluate an option by playing it out in his/her head. If it worked, they would do it; if it didn't, they would modify it; and if modifications failed, they would throw it out. In the incidents we studied, commanders simply generated each option and then evaluated it for viability. Usually, the first option an experienced firefighter generated was a viable option, but they also understand that they should simply be satisficing, not optimizing. They will not necessarily pick the best option. They will pick the first one which is possible and involves minimal risk. The first viable option is chosen and improved upon, if necessary. It is not compared with all other options to see which one will be best. As soon as it is deemed viable, it is chosen and applied.
Naturalistic Decision-Making has implications for training. Decision training needs to teach people to deal with ambiguous, confusing situations, with time stress and conflicting information. Situation awareness, pattern matching, cue learning, and typical cases and anomalies can be taught by giving people a bigger experience base. Training could teach decision-makers how to construct effective mental models and time horizons and how to manage under conditions of uncertainty and time pressure.
Methods for providing better training include changes in such things as ways of designing training scenarios. Another strategy is to provide cognitive feedback within After-Action Reviews. This would do more than point out the mistakes which were made in an exercise. It would be an attempt to show decision-makers what went wrong with their size-up, and why. Another method would include cognitive modeling and showing expert/novice contrasts. This would be done by allowing novice decision-makers to watch experts. Novice decision-makers would also benefit by learning about common decision failures. On-the-Job Training should be emphasized rather than simply assuming that once the traditional training is finished, decision-makers are ready to begin to function proficiently. Test and evaluation techniques and training device specification could also be improved. All of these might have an effect on the ability of firefighters to deal with stressful situations.
Why is it that people make bad decisions? I looked through a database of decisions to identify the reasons behind bad decisions. We came up with 25 decisions which were labeled as poor. Of those, three main reasons for bad decisions emerged. By far, the most prominent reason was lack of experience. A smaller number of poor decisions were due to a lack of timely information. The third factor was a de minimus explanation. In this situation, the decision-maker misinterprets the situation; all the information is available, but the decision-maker finds ways to explain each clue away and persists in the mistaken belief.
The problem of lack of experience has many effects (see Figure 2 [Figure 3.]). Inexperienced decision-makers lack the understanding of situations to be able to see problems, judge the urgency of a situation, and properly judge the feasibility of a course of action. These are skills which could be developed to improve decision-making.
Figure 4 - NDM factors that contribute to and lead to poor decision outcomes. Source: Klein
The field of Naturalistic Decision Making research is more appropriate than traditional decision-making models for understanding how crisis managers, such as firefighters, handle difficult conditions such as time pressure and uncertainty. We have broadened our focus from the moment of choice, to take into account situation awareness, planning, and problem solving. By so doing, we have gained a stronger vantage point for understanding errors and for designing training interventions."
The author and other experienced WFs and FFs find Dr. Klein's work quite compelling. After training the military, he sought out the firefighting realm to find his answers and research focus: experience, following procedure, and implied intuition. "About 80 percent of the decisions were made in less than a minute. As we started the study, we found that each expert firefighter told us that they had never made any decisions. They explained that they simply followed procedures. But as we listened, we realized that in each case, there was one option which they thought of quickly. They evaluated that one option, and if it seemed viable, they went ahead with it."
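Klein's description above is essentially his Recognition-Primed Decision (RPD) model: the expert pattern-matches the situation against an experience base, surfaces a single candidate option, mentally simulates it, and acts on it if it seems workable, rather than weighing many alternatives side by side. A minimal sketch of that loop, with all function and field names being illustrative assumptions rather than anything from Klein's work, might look like this:

```python
# Hedged sketch of a Recognition-Primed Decision (RPD) loop as described
# in the quoted passage: generate one option at a time by pattern matching,
# evaluate it, and act on the first viable one. All names are illustrative.

def recognition_primed_decision(situation, experience_base):
    """Return the first workable action, in recognition order."""
    for option in recall_options(situation, experience_base):
        if seems_viable(option, situation):   # quick mental simulation
            return option                     # act on the first viable option
    return None                               # no match: fall back to analysis


def recall_options(situation, experience_base):
    # Options surface one at a time, most typical (best cue match) first.
    cues = frozenset(situation)
    matches = [(len(cues & frozenset(case["cues"])), case["action"])
               for case in experience_base]
    for _, action in sorted(matches, key=lambda m: -m[0]):
        yield action


def seems_viable(option, situation):
    # Stand-in for the expert's mental simulation of the candidate option.
    return option is not None


if __name__ == "__main__":
    experience = [
        {"cues": ["smoke", "basement"], "action": "ventilate then attack"},
        {"cues": ["smoke", "roof"], "action": "defensive exterior attack"},
    ]
    choice = recognition_primed_decision(["smoke", "roof"], experience)
    print(choice)  # the single recognized option, evaluated and accepted
```

The point of the sketch is the control flow: there is no comparison step across options, which matches Klein's observation that experts evaluate one recognized option at a time.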
David O. Hart, TID, Inc. (Aurora, CO consulting company specializing in the delivery and development of Crew Resource Management (CRM) and human factors training)
What is -?
As we saw in the other presentations, there are a variety of ways to model decision-making. The importance here is that it can be modeled, described, and examined. By examining decision-making as a system, we can learn how attitudes, individual and cultural, affect the quality of our decisions.
There are as many decision-making definitions as there are models. For this discussion[,] we'll need to have a common reference to work from when talking about decision-making. Also, because we are talking about organization and team decision-making, we'll focus the following definitions in that direction. A definition of decision-making to keep in mind during this discussion is:
The process "of reaching a decision undertaken by interdependent individuals to achieve a common goal. What distinguishes team decision-making is the existence of more than one information source and task perspective that must be combined to reach a decision."
Close examination of this definition reveals many important aspects of the decision-making process in high-stress environments. These include, but are not limited to:
 Ill-structured problems
 Uncertain, dynamic environments
 Shifting, ill-defined, or competing goals
 Action-feedback loops
 Time stress
 High stakes
 Multiple players
 Organizational goals and norms
All these factors affect how well the decision-making machine works. If you think back, you've probably encountered most (if not all) of these factors during firefighting operations.
[Decision-Making] DM and Attitudes
In this discussion, the factors we'll be concerned with are those that relate to and affect cultural attitudes. In general, attitudes that enhance the DM process are seen as positive, and those that act as barriers to effective DM as negative. Many attitudes have both positive and negative effects. All this may seem intuitively obvious to even the most casual observer, but it is important to establish a common ground before we delve too deeply into this subject. In the spirit of "crawl, walk, run" we'll need to first understand how attitudes affect the individual before we can understand the impacts of cultural attitudes on an organization.
Attitudes and the Individual
Before we go too much further, we'll need another definition. This time we'll be defining attitudes.
The American Heritage Dictionary of the English Language defines attitudes as: "a state of mind or a feeling; disposition." A longer definition is: "An enduring organization of motivational, emotional, and cognitive processes with respect to some aspect of the individual's world. Attitudes are beliefs imbued with emotional and motivational properties." Another, shorter definition is: "Affect for or against a psychological object."
They all say the same thing—an attitude is how you feel about something. Now that we know what attitudes are, let's see where they come from.
Generally, your experience forms, has an effect on, or shapes your attitudes. Some attitudes may last only minutes, others a lifetime. Another way of looking at it is to say that your attitudes come from your values and goals (remember those DM factors). So the attitudes you use as firefighters come from your training and experience as firefighters.
What Do We Do with Attitudes
Attitudes help us make sense out of our surroundings and allow us to build and maintain our Situation Awareness (SA). How? By providing each of us a set of rules and guidelines we use to gather and process information. Therefore, attitudes aid in our decision-making by framing and shaping the information we use to make our decisions. You could almost say that attitudes are imbedded (sic) in every aspect of decision-making. Good, bad, or indifferent, attitudes affect the quality of our decisions.
On a team, the synergy that develops can compensate for attitudinal failures or barriers in one of its members. Effective teams recognize attitude problems and find ways to work around the "attitudinal outages". A good example of this is the issue of women as crewmembers in combat aircraft. Many male aircrew have a real "attitude" about women in the cockpit. Probable fallout from this barrier is reduced communication, increased stress, and conflict, with a resulting loss of efficiency and effectiveness. A good team will recognize the barrier and react by:
 Increasing communication to and around the affected people
 Closely evaluating the information sent by the affected parties to weed out any attitude biases
 Finding ways to reduce stress (knowing military crewmembers, humor would be a likely choice)
 Defusing any conflict before it engages the entire crew.
We've looked at the what, how, and why questions regarding attitudes and the individual, and even looked briefly at a possible individual attitude outage scenario and the team's possible response. Now let's turn our focus to teams.
Attitudes and the Team
Cultural attitudes—what are they, and why are they different? As to what they are, our definition is still valid, but with this added: the attitude is shared by every member of the organization. Organizations and teams use attitudes for the same purpose as individuals, to build and maintain their knowledge of the environment. The big difference is that the synergistic effect of teams magnifies and multiplies the effect of attitudes.
The multiplication and magnification cuts both ways. Positive attitudes provide a uniform strength and negative attitudes, uniform weaknesses. An example of a positive effect is providing baseline goals, values, and priorities (once again, remember the DM factors), to establish a cohesive team more easily and quickly. Failures are much more insidious.
When an attitude fails (e.g., is no longer valid) or is working against a team, it becomes an attitudinal "blind spot." Because everyone in the team and/or organization possesses the attitude, no one can perceive that there is a problem—there is nothing to compare it against. For example, the team has an attitude barrier that inhibits communication. By reducing the amount of information flow, and possibly, information quality, there can be a substantial loss of synergy, cohesiveness, leadership, recognition, awareness, and communication. All these elements, working at full capacity, are crucial to effective decision-making.
It is important to note that despite these undesirable results, critiquing and correcting the failure is difficult because you can't "see" the cause.
Where Attitudes Come From
We've already determined that an individual's attitudes come from his or her values and goals. The same holds true for any organization. The cultural attitudes grow out of the organization's values and goals. The source for these attitudes can be either internal or external to the organization.
Internal sources are the easiest to identify. Policy statements, directives, and even official memos are examples of how organizational goals and values manifest themselves.
Looking to the South Canyon Fire (SCF) incident, the Grand Junction District Management Team directive that all fires be "initial attacked and suppressed as soon as possible" is an example of policy working as a cultural attitude. What you gain from this attitude is a concrete direction for the firefighting teams. The goals of their decisions are unambiguous. On the flip side, this attitude can become a decisional one-way road. It doesn't provide a way out of a fire that cannot be suppressed. Also[,] the added emphasis on mission accomplishment can come into direct conflict with existing safety attitudes.
The "can do" attitude identified in the SCF investigation report is common to many [high-stress], [high-speed] teams. It helps build team cohesion, which is important to the team for synthesizing information and integrating the individual perceptions of the situation into a common perception. But taken too far, this attitude can have lethal consequences. When crews go above and beyond to complete the job, mission success is prioritized ahead of safety. We see this in the report where the "can-do" attitude is attributed with the compromise of the 10 Standard Firefighting Orders (SFOs) and 18 Watch Out Situations (WOSs).
In 1994, proximate to the SCF debacle and alleged "investigation," through the renowned "Hot Shot Network," this author discovered that the "can do" attitude was more insidious and had no positive connotations at all. It was instead a far more aggressive and competitive rivalry between the Hot Shots and the Smokejumpers - each feeling that they were somehow better than the other.
When there is a disconnect between training and experience, a barrier to effective decision-making exists. This disconnect causes a gap between the individual and resulting team perception of reality and actual reality. This example is more ambiguous than the previous two, but when seen in an actual example, it leaps right out at you. The SCF report found that "some firefighters failed to recognize the capability and limitations of the fire shelters and deployment sites." And "some questioned the value of the fire shelters under any conditions and may not have been carrying shelters." It is apparent that the training received was not supported or validated by the experience of the cited firefighters. This kind of gap between perception and reality can, and has, produced deadly results.
The final internal example is the attitude or sense of being part of a larger "family." This is most often seen as an elitist attitude. In this case[,] we use elitist to mean special, different, or set apart. It is often expressed with the statements "we watch out for our own," or "we take care of our own." This increased awareness of your team members translates into an increased safety awareness. Carried to an extreme, it can result in a lack of leadership. The B-52 bomber crash at Fairchild AFB in Spokane was allowed to happen because the commanders at the base failed to ground the pilot for flying the aircraft outside its operational limits because he was "one of our own," and for fear of "ruining his career."
For external sources of organizational attitudes, we'll look at two particular to firefighting, and one common to the entire federal government.
Pressure from the public and media generates the attitude that fires with the most public attention should be attacked first. Normally, being responsive to the needs of your customer is seen as a positive goal and attitude. But by allowing people outside the organization to control priorities, you end up with shifting, ill-defined, or competing goals (sound familiar?).
The harsh spotlight of the news media can have a similar effect. An organization is usually highlighted because of some failure or near-failure. The organization usually responds by reacting with abrupt changes in goals and values, then attitudes, then decisions. In the case of the SCF, the reaction was increased emphasis on safety, but unless the spotlight is on something that needs to be changed, the resulting changes may not be for the good of the organization.
The last external example is one that everyone connected with the federal government, most state governments, and some corporations have felt: "do more with less". In a perfect world[,] this would allow organizations to get the most from their resources. Unfortunately, we don't live in a perfect world. In reality, this attitude is a time bomb just waiting to go off.
"Do more with less" pushes people and equipment to perform beyond their capabilities, usually by sacrificing the normally accepted margins of safety. It usually takes a catastrophe many times worse than the SCF for the federal leadership, from Congress on down through each agency involved in the operation concerned (in this case, wildland firefighting), to realize that you do less with less. Adopting a "do less with less" attitude would mean letting some fires burn themselves out when they don't directly threaten the local populace. Unfortunately[,] decisions like these usually come at an immeasurable cost.
Attitudes, Training, and Experience
Attitudes, training, and experience have a deeply interrelated relationship. Cultural attitudes affect the emphasis of training, and experience shapes and modifies our attitudes. When experience and training validate each other, there is usually a positive attitude effect. When they don't support each other, there's usually a negative attitude effect.
Start with the training attitude that by emphasizing fire behavior, fuels, weather, and tactics, entrapments will be avoided. Add to that the historically low frequency of losses, and an experience-based invulnerability attitude (i.e. "it won't happen to me") can develop. The overall experience, expertise, and success of firefighters fosters the attitude that they can handle any fire (i.e. elitist, can do, or 10:00 fire), which in turn feeds the training and experience attitude "why should we over-learn emergency procedures (fire shelter use and bailing out of a situation)?" From this vantage point, it would appear that these attitudes are leading firefighters to lean on luck and circumstance to keep them safe.
The combination of low frequency of losses (experience), and highly experienced teams (experience) conspire to subvert important safety procedures and attitudes (training).
Attitude Impacts on SCF
Cultural attitudes played a significant [role] at South Canyon. Some of the cultural attitudes that were carried into the fire were:
 "All fires will be initial attacked and suppressed as soon as possible."
 "Highest priority fires are ones that threaten life, residences, structures, and utilities."
 "We can handle the fire."
 "Can do"
 "It won't happen to me."
This last attitude is a training/experience trap stemming from the fire training attitude and the fire shelter attitude.
What impact did these attitudes have on the incident? First, we need to recognize that safety and operational effectiveness are opposite sides of the same coin. The first Standard Fire Order supports this. At South Canyon, the additional emphasis suppression received was both caused by and resulted in the erosion of safety margins. Each time the firefighters "got away with" pushing into their safety margins to suppress a fire, it reinforced the attitude that they could do the job with a smaller margin for error. The fact that some of the firefighters were uncomfortable with the situation at South Canyon demonstrates that Grand Junction's suppression directive was causing some shifting and competing goals. This erosion of the safety attitude coupled with SA and communication breakdowns critically compromised the team and individual decision-making ability. Among the elements that led to this breakdown are physical and mental fatigue, recognition gaps, weather information not communicated or used, safety concerns not communicated, concerns about who was in charge (leadership)[,] and the numerous compromises of the SFOs and WOSs. When the blow-up occurred, these came together with deadly results. The attitudes also blocked the last escape path—dropping tools and packs, bugging out, and using shelters.
After situations like these many questions are raised. Some that need to be answered in order to affect any kind of change are:
 Do tactical teams know to increase meaningful communication during a crisis?
 Also, do they know how to communicate effectively?
 What about pre-planning for crisis situations?
 Do tactical teams get the best information before and during a fire crisis?
Changing Cultural Attitudes
Before we look at examples of how these changes are effected, let's look at why that change is made.
Why do cultural attitudes change? Because it is recognized that the long term goals of the organization are not being met.
How do you recognize that an attitude is no longer valid? Since you have no "attitude out" light, you usually know by unwanted results produced by practicing the behavior associated with the attitude. The feedback from the environment may be obvious or subtle. Because of the blind spot effect talked about earlier, it is harder for teams to find the offending attitude than for individuals. Organizations, being larger and more complex than their component teams, find it even more difficult to dig out an invalid attitude.
Why is it harder for an organization to change an attitude? You have many more people needing to change, and change is naturally difficult for people. Because they're doing something new and different, it takes time and effort to make it stick. Let's look at a typical process by which organizations can change attitudes. Then we'll look to commercial and military aviation as examples of organizations that have undertaken this kind of change.
Preliminary Requirements. Before the change process can be started, the organization, in particular the senior leadership and managers, needs to recognize that their greatest contribution to this sort of change is providing a supportive environment that will foster the growth of the change effort.
Patience, perseverance, and commitment from the leadership and managers [are] absolutely necessary. By recognizing that this sort of change happens one person at a time and that it will be slow and sometimes difficult, they will be supporting both the change and their own role in the effort.
For the individual, making the change can be as simple as changing the behavior associated with the attitude. This can happen very quickly, but may not have a lasting effect. As soon as the need for the change has passed, the individual is likely to revert to old behavior patterns and start the cycle all over again. Actually changing the attitude is more difficult than changing the behavior. It takes more time, but has a more permanent effect. For an organization, the time and effort are greatly magnified.
Commitment, or lack thereof, will either make or break this type of program.
What needs to be changed? Initially[,] a survey of the organization should be conducted to determine the attitudes and values regarding team effectiveness. Areas that are typically covered in this type of survey are leadership, communication, recognition and management of stress, needs for achievement, and job satisfaction. For accurate data to be gathered, anonymity is essential. In addition[,] a cross-section of the entire organization, top to bottom, left to right, needs to be sampled to prevent inaccurate, misleading, and skewed data. This information is then used as a benchmark to measure the change against, and to help determine the types of tools necessary to make the change.
How does it happen? Using the data from the survey, a program of change is developed. Usually[,] this takes the form of training or organizational interventions. The program is usually developed by or in conjunction with professionals involved in this arena. Credibility of the developers, program, and delivery personnel is critical to the program's success. This is the first step in assuring the buy-in of the front-line teams.
Finally, programs should be designed to fit seamlessly into the culture. It can't be seen as a one-time fix or just another training requirement. To change the culture, it must be part of the culture.
Where does it start? Programs which work to improve team attitudes and effectiveness usually consist of a number of inter-connected training modules.
Initial "awareness" training is designed to introduce the program and set the stage for the training to follow. It is usually directed at all organizational members who are targeted for change.
A leadership/management "staff" course for the senior management is also conducted in the initial phases. These programs provide management personnel the essentials to fulfill their role in the change process. They need to "walk the talk" if they expect the rest of the organization to do the same.
Baseline training is the longest and most in-depth phase. It provides the background, vocabulary, skills, and feedback the teams need to affect this change.
Instructors and Evaluators play a special role and therefore need special training. This type of training is focused on observing, instructing, and evaluating the new attitude.
Finally, continuation training provides ongoing reinforcement of the concepts learned in the baseline training. For the best results, it should be practiced in an environment as close to actual as possible.
As with support, training must also run from the top down. No one is exempt from training, no matter what their standing in the organization. Each phase builds on the previous. This continuity is necessary so that previous training isn't invalidated by the next phase. The training that is the most important is usually the most neglected.
Instructor/evaluator and continuation training are probably the two most critical modules for assuring long-term success. The instructors and evaluators must embrace the change and its concepts and procedures, or the training will be useless. Lack of buy-in from the instructors and evaluators can result in training invalidating training, and evaluation invalidating or ignoring training.
Continuation training, on the other hand, keeps the ball rolling. Remember this is a long-term program, not a quick-fix Band-Aid. These concepts and skills need to be revisited not just annually, but at every training opportunity, if they are going to be a permanent part of the culture. As with anything new, practice, practice, practice makes perfect. One final, important point regarding continuation training—Keep it Fresh!! Nothing will kill a program faster than tired, overused training material. As new information becomes available it should be integrated into the program.
Looking back...we see that this is just a sample of what a program for cultural change could look like. A real program is much more complex, but then again, real change is much more challenging.
Other organizations have undertaken to change attitudes within their culture. Most notable is the aviation community. We'll look now at commercial and military aviation to see what brought them there and what they've done and gained.
In the Beginning...
The 1970s saw a number of air carrier crashes. The fact that aircraft crashed wasn't new, but the reasons for crashing were. More and more accidents were being attributed to "human" or "pilot error." Highly experienced, trained, and motivated (sound familiar?) crews were allowing aircraft to crash. Most notable is the Portland DC-8 crash, where the crew flew the aircraft out of gas while troubleshooting a gear problem on a clear night within sight of their destination. Another is the L-1011 that slowly descended into the Florida Everglades as the crew tried to decide what was wrong with a 68¢ lightbulb. The crew was focused on the lightbulb and no one was minding the store. Why?
The "why?" questions were asked by the airlines also. Human error was the answer—but how do you keep it from happening? This answer took the form of Cockpit Resource Management (CRM).
A program for change was initiated at a number of airlines. It probably looked like the program we just outlined. What they found was that certain elements in the human equation needed change. They were, and are, communication, stress management, leadership, decision-making, and attitudes. These programs are designed to make the pilots and flight engineers more effective and efficient flight crews.
As the programs became more and more a part of the airline culture, the benefits of this type of training [were] seen in other areas within the community. They also started seeing some return on their investment.
A notable (but not isolated) case is the Sioux City DC-10 crash. En route to their destination, the #2 engine, the one in the vertical stabilizer, disintegrated. Pieces of the engine cut through the hydraulic lines for the primary, secondary, and standby systems. Without hydraulic power, the pilots were unable to control any of the flight control surfaces. By all rights, the aircraft should have crashed, killing everyone aboard. That's what the engineers at the airline and aircraft manufacturer said. But Capt. Al Haynes attributes his and the passengers' survival directly to CRM. The open, continuous communication, creative synergy, and their recognizing and using all available resources are principles at the heart of CRM, and were the ones used successfully by the flight crew.
Increased focus on and awareness of effective and efficient flight operations helped to broaden the scope of the program. The first to be brought into the fold were the cabin crew, hence the name change to Crew Resource Management. Then it spread to the maintenance organizations.
In the early to mid-1980s military aviation became aware of the benefits of CRM. The USAF Military Airlift Command (MAC) was the first to come on board. Their operations were the closest to the airlines, so it was natural for them to see the benefits first. MAC "spun" the airline programs to better fit their environment. The military was interested in the effectiveness and efficiency issues, but was more interested in CRM's major by-product: SAFETY. In an environment where your enemy is actively trying to reduce you to an aluminum rain shower, a program that keeps you from doing your enemy's job is always attractive!
Today, CRM is an inseparable part of the airline culture. Human factors related accident rates are down, incidents are down, safety is up and so is efficiency. The program is working.
As for the military, the change is still taking root. Military CRM hasn't reached the stage the airlines have, but then as we have said, these things take time. It has also moved out of the aircraft arena. Other military units are seeing the benefits of CRM. Maintenance, test engineers and pilots, and special forces units are just a few that have embraced the concepts of CRM.
Changing a cultural attitude can be a daunting process. But in this environment, as in some of the others we've talked about, ignoring an attitude that is in conflict with the organization's goals and values is not just inconvenient, it's downright lethal.
By believing that what you're doing is important, you will be able to make the changes in your culture. These changes will have far-reaching benefits for the individual and the organization in safety, decision-making, and operational effectiveness."
Notwithstanding the fact that Mr. Hart is otherwise absent from the field of "experts" listed elsewhere in the wildland fire human factors literature, the authors considered that Mr. Hart did an exceptional job addressing the human factors, leadership, and Crew Resource Management fields.
Appendix D—Keynote Presentations
South Canyon Revisited: Lessons from High Reliability Organizations
Karl E. Weick, University of Michigan
"In this paper[,] I want to explore the idea that organizing to prevent wildland fire disasters such as the South Canyon Fire on July 6, 1994[,] in which 14 people lost their lives, is an ongoing struggle for alertness. My intention is to look more closely at that struggle. I want to do 4 things. First, I want to discuss 4 pieces of my earlier analysis of the Mann Gulch fire that seem relevant to South Canyon. In particular, I want to discuss briefings, leadership, tools, and wisdom.
Second, I want to discuss organizational issues at South Canyon that are less visible in Mann Gulch. These include discrepancies, levels of experience, the will to communicate, and Watch Outs involving management. Third, I want to touch on solutions. And I want to conclude by discussing some questions about South Canyon that continue to haunt me.
The idealized image of the Mann Gulch Fire (Figure 5) and Dr. Putnam's amazing panoramic photo of the South Canyon Fire (Figure 6) will assist in a clearer understanding of Dr. Weick's paper below.
Figure 5 (left). Idealized image of 1949 Mann Gulch Fire Source: FireRescue1
Figure 6. (right) Colorado Fire Camp "Figure 42—Panorama of the South Canyon Fire site taken 2 years after the fire from the ridge directly west across the West Drainage from the West Flank (Photo is a composite of two photographs taken by T. Putnam)." Source: Colorado Fire Camp
Similarities Between Mann Gulch and South Canyon
Briefings. The struggle for alertness at Mann Gulch was undermined by many of the same things that undermined it at South Canyon, one of which is briefings. Briefings are an attempt to give people in a crew a common framework in advance including assumptions about what they may face, how it will develop, and how the crew will function and update its understanding of what is going on.
At Mann Gulch, the crew of 14 essentially proceeded without much of a briefing. They basically knew only that they were jumping on a fire that would likely be out by 10:00 the next morning. After landing, all some of them knew was that Dodge had scouted the fire on the South slope with Harrison, had used the phrase "death trap" to describe what he found, and had ordered the second-in-command William Hellman to march the crew down the North slope toward the Missouri River. Dodge didn't say whether this tactic was to escape the death trap or to position the crew to fight the fire, or simply to get closer to the river. When the fire spotted to the North side of the gulch, Dodge turned the crew around and angled them up toward the ridge, and soon ordered them to drop their tools, and then to enter an escape fire, all without verbalizing his reasons (Dodge, 1949, p. 121). Since the crew did not know each other well, since Dodge knew only 3 of them, since several were on their first jump, and since Dodge himself was rusty on leading a crew (Maclean, 1992, p. 41), it was imperative to build some common understanding and common action into this assortment of strangers. That didn't happen.
But neither did it happen 45 years later at South Canyon. The South Canyon accident investigation team allocated almost a full page (Report of the South Canyon Fire Investigation Team, 1994, p. 26: hereafter referred to simply as Report, 1994) of their report to "Safety briefings" as a "significant contribution" to the 14 deaths. The hand-off of the fire the evening of July 5 from the BLM crew to the smokejumpers and Jumper[-in]-charge Mackey is a good example of how not to brief people. The hand-off is by radio rather than face to face, is made after the BLM crew who know the terrain and foliage has left the scene, and the jumpers inherit a handline which is partially constructed but already lost by the time they collect their gear and are ready to extend it. Without checking whether the assumption is correct or not, the departing Incident Commander says in his statement, "I knew (ia.) that Mackey would look (sic) at fire from the air before they jumped and that he would make a decision on what to do with it after we left. I did not feel that smokejumpers needed additional guidance" (Report, 1994, p. A 5-9). Mackey got off to a bad start, and the quality of the briefings didn't improve much from then on. For example, the Prineville Hot Shots were not told how Gambel Oak burns when it is dry, nor were they told that in previous days, fires had made spectacular runs through this material in Colorado.
Why so much casualness? One possibility is that everyone seriously underestimated how much continuing effort and shared information it takes to build coordination and hold it together, especially during transitions from an initial attack to an extended attack, from one level of complexity to another, and from one organization to another. The investigation team, on p. 6, states the following: "as is typical in extended attack situations, firefighting groups arrived on the fire at intervals from dispersed locations and blended into the existing organization." The key word there is "blended." Blending sounds like something that occurs automatically, not something that people work at. Many would say it's especially hard to blend into an "existing organization" if that organization itself is invisible, as was the case for some people at South Canyon. Some people trying to blend did not know who the Incident Commander was, or which radio traffic had the force of authority, or what the suppression strategy was since it seemed counter-intuitive.
The questions that need to be pursued are, why does briefing continue to be treated casually, and what does better briefing sound like? Back in 1949, during the investigation of Mann Gulch, Henry Thol's father understood the essentials of a briefing even if much of his emotional testimony ("I owe this to my boy," p. 201) was tough to follow. "Usually the foreman he always looked out for all, to take care of anything that happened. We always looked out for that before he put the men on the fire line. He had something to fall back on . . . let's go in there boys, the wind isn't blowing now. We'll go in there. But watch out, the wind can change any moment" (Thol, 1949, p. 200). More recently, researchers have studied effective cockpit crews in aircraft and have found that better briefing leads to better performance. This is relevant because in cockpits, as on fire lines, people often work with strangers. In particular, effective leaders establish and reaffirm norms of conduct for behavior in the group, and insist that people keep each other informed about what they are doing, the reasons for their actions, and the situational model that gave rise to those reasons and actions. Almost no one at Mann Gulch or South Canyon heard someone say,
1. Here's what I think we face;
2. Here's what I think we should do;
3. Here's why;
4. Here's what we should keep our eye on;
5. Now, talk to me.
Leadership. But Mann Gulch and South Canyon are similar not only in their casual briefings. There was uncertainty about leadership in both cases. At Mann Gulch, leadership moved uneasily among Navon, Hellman, and Dodge. At South Canyon, it moved uneasily among Blanco, Mackey, Longanecker, and Shephard, among others. At Mann Gulch, as at South Canyon, crew members were not closely acquainted with their foremen due to [the] continual rotation of people among crews and assignments (Fite, 1949, p. 28). Dodge knew only 3 people in his crew: Hellman, McVey, and Thol (Dodge, 1949, p. 125). Hellman, who was better acquainted with the men (Dodge, 1949, p. 125), was near the front of the line as they raced uphill (Sallee, 1949, p. 76) and reportedly said "to hell with that, I'm getting out of here" when Dodge ordered people to jump into his escape fire.
At Mann Gulch, people were torn between 2 conflicting influences. But the same thing happened at South Canyon. Haugh and Erickson both yelled at the retreating hotshots to drop their tools (Report, 1994, p. 16) and run for the ridge, while Thrash, who was at the head of the line of jumpers and hotshots, stopped and began to deploy his fire shelter, as did smokejumper Roth. Hipke and Blecha said, in essence, to hell with that, I'm getting out of here, and continued to run.
This similarity may be merely a coincidence, or it may be more significant. It seems worth exploring, however, because it adds uncertainty to a situation that already has lots of puzzles. Uncertainty about leaders puts increased demands on crews, dispatchers, and pilots at a time when they are close to overload. Uncertainty pulls groups apart, which makes them more susceptible to panic (Weick, 1993, pp. 637-638). And uncertainty in the face of unclear leadership often cuts off the flow of information because people don't know whom to send it to, and responsibility keeps shifting at will. As we will see later, uncertainties about leadership were not confined to South Canyon. They extended up through the organization, and this set the tone for actions reflected throughout the organization.
Tools. A small but powerful similarity between Mann Gulch and South Canyon is that, in both cases, when people were fleeing the blowup and were told to drop their tools so they could move faster, some resisted. Several calculations suggest that this resistance may have cost them their lives (Report, 1994, p. A3-5). They would have been able to move 15-20% faster (Putnam, 1994) without their packs and tools. Firefighters are not the only people who are reluctant to drop their tools. Naval seamen are trained to wear steel-toed shoes at all times and often refuse to take them off when they are ordered to abandon a sinking ship. Fighter pilots report being reluctant to eject from the "warm womb" and oxygen "cocoon" of an out-of-control cockpit into a far harsher environment. It is just as hard to drop shoes or an aircraft as it is to drop a pulaski and a pack.
At Mann Gulch, Dodge told his crew to "drop all heavy tools" 200 yards after they turned upslope. According to Sallee (1949, pp. 75-76) and Rumsey (1949, p. 103) people either threw away everything or nothing. Dodge in his testimony said he "didn't know until later that they had discarded shovels and pulaskis" (1949, p. 118). Sallee reported that with the fire racing at them, smokechaser Harrison was sitting resting "and he still had his pack on" (Sallee, 1949, p. 88).
This same pattern was repeated at South Canyon. Some of the smokejumpers who deployed their shelters above the lunch spot did drop their tools. But in doing so, they were struck by the enormous symbolic significance of what they were doing. One observed that putting down a saw was like running up a white flag (Rhoades statement); another (Petrilli), that the "pucker factor" went up a notch (Report, 1994, p. A5-69).
What about those who didn't drop their tools? If dropping your tools signifies you're in deep trouble, keeping them may help you feel you're safe. To hold onto your tools is to stay in control, to remain a firefighter rather than a victim, to appear calm. I'm still in it. This is not just an issue of symbolism since tools are needed to scrape an area clear before deploying a fire shelter. But the reluctance to drop tools may come from other sources such as economics, habits, avoidance of failure, predictions of fire behavior, and social dynamics. Equipment is expensive and jumpers, at least, are told repeatedly and early in their training to carry out everything that is dropped to them. Habits built up during training are much more likely to involve moving with tools in hand, rather than moving and discarding tools. People have no idea what it feels like to run and discard tools or even how to do it. Rhoades in his statement mentions that as he was running to escape the South Canyon fire he kept looking for a place to put the saw down so it wouldn't get burned, a search which undoubtedly slowed his progress. In his words, "at some point, about 300 yds. up the hill....I then realized I still had my saw over my shoulder! I irrationally started looking for a place to put it down where it wouldn't get burned. I found a place I it (sic) didn't, though the others' saws did. I remember thinking I can't believe I'm putting down my saw." These words have even more impact when it is recalled that, among the fatalities, firefighter #10 (Putnam, 1994) was found with a saw handle still in his hand. To discard one's tools may signify more than giving up control, it may also be an admission of failure which, in a "can do" culture, is a devastating thing to admit. [Putnam, Ted. 1994. Analysis of Escape Efforts and Personal Protective Equipment on the South Canyon Fire. Missoula, MT: USDA Forest Service, Missoula Technology and Development Center]
There is a further complication with the seemingly simple act of dropping one's tools. If people drop their tools, they still face a tough choice, namely, do I now run faster or do I stop and deploy my shelter? It is tough to do both, although some people at South Canyon tried. Running faster and stopping to deploy are incompatible, and uncertainty about which one to do may compel people simply to keep doing more of what they are already doing, namely, running with tools. To keep running is to postpone having to make a tougher choice, especially if the person feels both exhausted and uncertain how safe the shelter really is. People may also hold onto tools because their predictions of fire behavior suggest that the fire won't reach them. This is a clear possibility at South Canyon. As the fire moved toward the hotshots and jumpers moving North along the fireline, it was repeatedly channeled toward the ridgeline along draws that ran at right angles to their movement. This fire behavior could have created the impression that the crew was at the flank rather than the head of the fire, which meant there was no need to drop tools.
Finally, people may hold onto their tools as a simple result of social dynamics when they are lined up. If the first person in a line of people moving up an escape route keeps his or her tools, then the second person in line who sees this may conclude that the first person is not scared. Having concluded that there is no cause for worry, or that "I'm not going to be the only one who goes back without tools," the second person also retains his or her tools and is observed to do so by the third person in line, who similarly infers less danger than may exist. Each person individually may be fearful but mistakenly concludes that everyone else is calm. Thus, the situation appears to be safe, except that no one actually believes that it is. The actions of the last person in line, the one whose back feels the heat of the blowup most intensely, are observed by no one, which means it is tough to convey the gravity of the situation back up to the front of the line.
What hasn't changed in 45 years is the power of symbols. Packs and saws may be heavy and slow one's pace. But that may be one of their less important qualities. More significant may be their ability to reduce one's sense of danger. If throwing tools away is a sign of surrender, keeping them may be a sign of a standoff or victory. It may be important for trainers to emphasize, "Look people, you're going to want to hang onto this stuff. Don't! It could cost you your life."
Wisdom. The fourth aspect of my Mann Gulch analysis that fits South Canyon centers on the idea of wisdom. To understand why the idea of wisdom fits here, you need to understand first that wisdom is a mixture of knowledge and ignorance. When one of them grows, so does the other. To know something better is also to discover that new questions about it are raised. Wisdom is an attitude that what you know is only part of what could be known, and therefore, you need to stay alert. You need to avoid excess confidence that you know everything and excess caution that you know nothing, if you want to stay flexible.
Wise organizations know what they don't know. They know two things: first, they know that they have not experienced all possible failure modes and second, they know that their technology is still capable of generating surprises (Schulman, 1993). Thus, when they act on the basis of their past experience, wise organizations act as if that experience is both credible and limited. They simultaneously believe and doubt they know what is up. Consider the case of a near miss or a close call. The fascinating thing about a near miss is that, "Every time a pilot avoids a collision, the event provides evidence both for the threat and for its irrelevance. It is not clear whether the learning should emphasize how close the organization came to disaster, thus the reality of danger in the guise of safety, or the fact that disaster was avoided, thus the reality of safety in the guise of danger" (March, Sproull, and Tamuz, 1991, p. 10). If the moment is interpreted as safety in the guise of danger, then learning should be diminished because "more thorough investigations, more accurate reporting, deeper imagination, and greater sharing of information" are all discouraged (Sagan, 1993, p. 247). The attitude of wisdom sees a near miss as evidence that the system is both safe and vulnerable, that people must remain alert, and that a safe environment is not measured by an absence of accidents (that outcome is largely dependent on luck), but is the result of active identification of hazards and their elimination (Allinson, 1993, p. 186). [This author avoids and denies the entire "luck" thing]
At Mann Gulch, people believed they were fighting a fire that would be out by 10:00 the next morning and failed to raise questions about whether this expectation remained accurate. At South Canyon people believed they could "hook" the fire before the winds would build and they presumed that lookouts and a commander had the big picture even though the firefighters had seen no evidence of this.
The attitude of wisdom is one way to remain alert, because it leads people to remain open to what is happening and to rely cautiously on their past experience. I've always been struck by evidence suggesting that there are certain periods during a person's career when they are most in danger of getting injured or killed. Police, for example, are in most danger of being shot during their 5th year on the force. Firefighters are in most danger of fireline accidents either in their first 2 years or after 10-15 years of experience (Pyne, 1984, p. 391). Young firefighters are vulnerable because of their inability to recognize hazardous situations. The more experienced firefighters are vulnerable because they presume they've seen it all; they have less openness to new data, so the validity of their models decreases. The unexpected gets them.
Crews and commanders need to keep learning and updating their models. This won't happen if they presume that nothing about fires can surprise them, if near misses are treated as testimonials to safe practices, and if they are certain that they've experienced all possible ways in which a system can fail. These attitudes won't change if they reflect similar attitudes in top management. You may recall that Maclean felt "the Forest Service wanted to downplay the explosive nature of the Mann Gulch fire to protect itself against public charges that its ignorance of fire behavior was responsible for the tragedy" (Maclean, 1992, p. 125). The key word there is "ignorance." The service doesn't want to appear ignorant. Nor do [its] crews. The price of creating this impression may be a loss in vigilance, learning, and wisdom.
It is tempting, in a world of boldness and aggressive attacks, to conclude that there is no place for doubt. But as Thoele (1994) has suggested, the best firefighters do not confuse risk with recklessness, and they are able "to say 'no' without sustaining dents in their machismo" (p. 28). That's what wisdom is about, and why it's worth striving for.
"Whenever there is any doubt, there is no doubt. ... All good things come to those who wait" Ronin movie quotes
Differences Between Mann Gulch and South Canyon
Discrepancies Between Beliefs and Actions. Having suggested at least 4 ways in which dynamics of organizing in South Canyon replay themes that unfolded earlier in Mann Gulch, I now want to explore some additional issues that were less visible in Mann Gulch but that stand out in South Canyon.
The first of these is the unusually large number of inconsistencies between beliefs and actions at South Canyon. I want to dwell on these because they suggest one reason why people persisted so long doing things that violated fire orders and watch outs.
A recurring belief among people fighting wildland fires is that some of the fires they fight are on worthless land. This was a prominent issue at Mann Gulch. As Earl Cooley (1984) put it, "One of the main questions was why we risked lives and spent many thousands of dollars to save scrubby timber and cheatgrass" (p. 91). A basic discrepancy that firefighters and overhead face over and over is between their belief that the land is worthless and the reality that they are risking their lives to defend it. The action of defending is inconsistent with the belief that the area is worthless. Contradictions such as this cause tension and continue to do so until the person either changes the belief—the land is more valuable than it looks—or changes the action, and uses low priority suppression tactics. Either change reduces the inconsistency.
Let's extend this scenario to South Canyon and a key decision: the decision made at 9:30 the morning of July 6 to cut a direct fireline downhill (Report, 1994, p. A4-6). What is noteworthy about this decision is that it involves a troublesome discrepancy. Building direct line downhill is dangerous. Longanecker said, "going downhill direct is a bad deal" (Report, 1994, p. A5-52). Archuleta asks, "Why are we punching in line?" Erickson asks, "Where are the safe areas?" and hears the answer, "there really aren't any." Rhoades, Doehring, and Shelton overhear this conversation. But the decision is made to build a direct line anyway, which leaves everyone tense. They believe that the action is dangerous, yet they are doing it. What makes this really troubling is that the decision is a public, irrevocable choice. There is good research evidence (e.g., O'Reilly and Caldwell, 1981; Salancik, 1977) that when people make choices of this kind, they are more likely to change their beliefs so that they become consistent with the action they are now committed to. In this case, people should begin to believe that building direct line downhill is safe after all, in order to justify what they are actually doing.
And that's what seemed to happen. Listen to how Quentin Rhoades, in his own words, handled things: "I resolved not to go down that hill digging line . . . Smokejumpers arrived and started digging line. I remember thinking that I must have missed something. I hadn't been on a fire since August 18, 1992 and I felt a little green." Rhoades convinces himself that the main reason the situation seems dangerous is that it's his fault: he's rusty, he's missed something, which means the situation is not as dangerous as it looks. Other people resolve the discrepancy in other ways. They convince themselves that the leaders know what they're doing, that it won't take long to cut the line, that the predicted weather front won't be that strong, that they can "hook the fire before the front passed" (Report, 1994, p. A5-53), that the crews are really on top of this job, and that more resources are coming (Report, 1994, p. A5-47). There is a grain of truth in all of those explanations. But people also have a stake in needing them to be true, since they reduce the tension associated with doing something they believe to be dangerous. The trouble is, they now have a vested interest in not seeing warning signals. If they do notice these signals, then their whole sense of what is happening collapses. Listen again to what Rhoades says: "My ditty bag contained a copy of standard fire orders and watch situations. I considered looking at it, but didn't. I knew we were violating too many to contemplate."
When people take public, irrevocable actions for which they feel responsibility, their mind set is to justify those actions and to assemble evidence that shows the action makes sense (Ross & Staw, 1986). They are not indifferent toward evidence that raises doubt about the action. Instead[,] they avoid, discredit, ignore, or minimize this contrary evidence and keep looking for positive reasons that justify continuing the action. People who justify their actions persist, or in the words of the investigating team, "strategy and tactics were not adjusted to compensate for observed and potential extreme fire behavior" (p. 35).
I have dwelt on this one decision at South Canyon to show how people justify their actions and, in doing so, become more committed to continuing those actions. There are several other discrepancies that could be analyzed the same way, such as the belief that this was a low-priority fire, yet Type 1 crews were put on it; the policy that two or three trees burning is a standard smokejumper dispatch (French), yet jumpers were not dispatched immediately; the belief that this is a potentially serious fire, yet a crew walks off it the night of the 5th; the belief that retardant works only at certain stages of a fire, yet requests for it at that stage are refused; aerial reconnaissance that spots fingers of fire in the west drainage on July 6, yet these are not drawn on the map (Report, 1994, pp. 26, A5-70). My point is not simply that there were discrepancies at South Canyon. Life is full of discrepancies, and people manage to deal with them by sizing up pro and con evidence. My point is that key discrepancies at South Canyon seemed to occur in a context where people got locked into public, irrevocable, volitional actions and had to justify those actions. These justifications made them more committed to those actions, which led them to persist longer in executing those actions despite growing dangers. Notice that the people who would be spared from this process of escalation would be those who were forced to cut line (there is low choice), people who saw escape routes (the action is revocable), and people who did not express their views in public (the decision is not linked to them as individuals).
Levels of Experience. Earlier I mentioned that experience has both an upside and a downside. The upside is that it gives you more patterns that can be retrieved and matched with current puzzles to make sense of them. The downside is that more experience can sometimes lead to less openness to novel inputs and less updating of the models one uses. Failures to revise often produce ugly surprises.
I want to dig deeper into the issue of experience levels at South Canyon, partly because the accident investigation team seemed reluctant to do so. I say this because if you look at the Fire Entrapment Investigation and Review Guidelines (Report, 1994, pp. A12-3 to A12-11), which they followed religiously in structuring their report, the only category out of the 28 that they omitted was category 23, "V. Involved personnel profiles - Experience levels" (Report, 1994, p. A12-7). This omission may be due to the fact that, on paper, everyone is qualified. But just because they're qualified on paper doesn't mean that their experience is deployed well in this incident, or sufficient to handle its changing character, or easily adapted to it. Issues of experience levels at South Canyon are complicated, difficult to untangle, and touchy when untangled. But that's no reason to avoid them.
The overall level of relevant experience for leadership appears to be low. Several people appear to be in over their heads, which gives a whole new and somewhat chilling connotation to the personnel category, "Overhead." Experience is unevenly distributed across the several activities at South Canyon and does not always line up with authority. There are no clear mechanisms to mobilize and focus and implement the experience that is scattered around. And finally, everyone is accessing their experience under increasing amounts of stress, which means they are likely to fall back on those habits and understandings they have overlearned (Weick, 1990, pp. 576-577). Unfortunately, these may be the very habits and understandings that are least relevant to the unique conditions in South Canyon.
There are at least three reasons we need to tackle the issue of experience and how it is mobilized. First, an important finding from studies of high reliability organizations is that they have multiple structures. Aircraft carriers, for example, have a bureaucratic hierarchical structure for normal functioning during slack times, a different structure built around expertise for "high tempo" periods of extended flight operations, and a third structure explicitly designed for emergencies. High tempo structures are especially relevant for wildland firefighting where rank in the formal hierarchy does not always coincide with technical expertise. LaPorte and Consolini (1991) describe a high tempo structure on carriers this way: "Contingencies may arise that threaten potential failures and increase the risk of harm and loss of operational capacity. In the face of such surprises, there is a need for rapid adjustment that can only rarely be directed from hierarchical levels that are removed from the arena of operational problems. As would be expected, superiors have difficulty in comprehending enough about the technical or operational situation to intervene in a timely, confident way. In such times, organizational norms dictate noninterference with operators, who are expected to use considerable discretion."
"Authority patterns shift to a basis of functional skill. Collegial authority (and decision) patterns overlay bureaucratic ones as the tempo of operations increases. Formal rank and status declines as a reason for obedience. Hierarchical rank defers to the technical expertise often held by those of lower formal rank. Chiefs (senior noncommissioned officers) advise commanders, gently direct lieutenants, and cow ensigns. Criticality, hazards, and sophistication of operations prompt a kind of functional discipline, a professionalization of the work teams. Feedback and (sometimes conflictual) negotiations increase in importance; feedback about 'how goes it' is sought and valued" (p. 32).
People in South Canyon did not seem to have the capability to form a high tempo structure where influence flowed from expertise and experience, rather than from the formal chain of command. In part, the problem was that it was never clear where the relevant expertise was located so that the structure could form around it. Furthermore, there was no clear chain of command that could defer to more experienced people nor was there a clearly understood set of signals by which such a shift in structure could be conveyed immediately and unequivocally to everyone.
A second reason the issue of experience is important is that it has the potential to create a smarter system that senses more. A key idea in system design is the notion of requisite variety: it takes a complex system to comprehend a complex environment (Miller, 1993). Analyses of South Canyon that are consistent with this principle have already begun to appear. For example, Topic 3.5 in the IMRT review states that managers should "match qualified incident commanders with the complexity of incidents" (Wildfire, Vol. 3, No. 4, Dec. 1994, p. 46). That's requisite variety. Inadequate requisite variety occurs when a less complex incident commander, or a less complex jumper crew, or a less complex dispatcher, cannot adequately comprehend a more complex event.
Requisite variety that is more adequate can be illustrated by a crew of smokejumpers who have had prior experience as hotshots. Such a crew has the capability to function either in a more independent jumper mode or a more disciplined hotshot mode, which gives them a larger variety of ways to cope with a larger variety of fire behaviors.
The notion of requisite variety also alerts us to a hidden danger in successful firefighting. There is growing evidence that success leads to system simplification (Miller, 1993), which means successful systems steadily become less sensitive to complex changes around them. This insensitivity culminates in a sudden string of failures and the horrifying realization that one has become obsolete and faces a nasty, prolonged period of playing catch-up.
Again, the lesson from high reliability organizations such as the Diablo Canyon nuclear power plant is the need to cultivate diverse experiences, variety, multiple points of view, and conceptual slack (Schulman, 1993) so that people have a better sense of the complexity they face. And there also need to be well-learned, trusted procedures to handle the inevitable conflicts that arise when people make different interpretations, such as when a Fire Management Officer and a Hotshot superintendent differ on how the fire should be fought.
The third and final nuance of experience that I want to raise is the question of what happens when you are at the limits of your experience, where demands exceed capabilities? And what can be done about it?
For the sake of illustration, let's look at jumper Mackey, who was jumper-in-charge at South Canyon and who had just recently been given a permanent appointment. What's interesting and troubling about Mackey's position is that the system makes it hard for him to do a good job on this fire. If we put ourselves in Mackey's shoes, we discover that he is in a bad spot almost from the start.
He starts with a sloppy hand-off the evening of July 5 and an unfinished project which he is unable to continue. He's dropped on unfamiliar terrain, at twilight, with rolling debris and steep slopes. The crew is unable to get much sleep. The resources (two Type 1 crews) that Mackey requests the night of the 5th arrive in small numbers at unpredictable intervals the next day (8 jumpers at 10:00 a.m., 10 hotshots at 12:30 p.m., another 10 hotshots at 3:00 p.m.), and Mackey is not even sure they'll come at all since he's been told his fire is low priority. When there is disagreement about building line direct and downhill, the incident commander does not resolve it, and the hotshot superintendent does not seem to question the strategy when he arrives around noon (Report, 1994, pp. A4-6, A4-7).
At some level[,] Mackey knows the downhill strategy is risky because, in response to a flare-up at 10:35 AM, he begins to pull the crew out (Report, 1994, p. A5-70), only to have that decision questioned by Longanecker, who suggests doing bucket drops. The drops are made and the crew resumes cutting line. Not long after this, Rhoades observed that "Don looked terrible." Still later, when the saw Rhoades is using breaks down, Mackey offered to sharpen it and help him cut line. This looks like a clear instance of a person falling back on overlearned behavior when that person is under pressure. Mackey discards the less familiar activity of keeping one's head up and supervising for the more familiar activity of keeping one's head down and cutting line.
I mention this example to make the point that when demands exceed capabilities, which is the basic condition under which people experience stress (McGrath, 1976), this is seldom simply the fault of an individual. The buck doesn't stop with that person. Instead, the buck stops everywhere (Allinson, 1993). The people around Mackey made his assignment harder and reduced his capabilities to handle it. The resulting pressure made it harder for Mackey to gain access to the experience he already had, which increased pressure when his decisions were questioned, which gave him even less access to his experience until he was caught in a vicious circle where he did what he had always done on fires, namely cut line rather than supervise. The Hotshots had no idea something like this might be developing, and when they saw Mackey, he seemed to be moving around and checking, which is what overhead is supposed to do.
The system let Mackey down. It did little to remove or redistribute pressures, it did little to simplify his assignment, and it did little to monitor the fact that he and others had less and less energy to cope with growing complexities. The crew was losing variety and alertness, and no one spotted this or slowed the loss, or altered the work so that whatever alertness remained was sufficient.
Figure 7. Federal employee firefighter fatalities by cause, 1990-2008 (pie chart); burnovers account for 37% of fatalities. Source: CPS.