What Outdoor Programs Can Learn From Aviation

How culture shapes policies, curricula, and other task objects that influence safety

 

On July 6, 2013, Asiana Airlines Flight 214 departed Incheon International Airport in South Korea, bound for San Francisco International Airport (SFO).

On final approach, the Boeing 777 came in too low and too slow. Its airspeed had fallen below the required approach speed, and its descent rate increased. The instructor pilot took over from the trainee and initiated a ‘go-around’, but it was too little, too late.

The tail struck the seawall at the edge of the runway, and the aircraft caught fire. Three passengers died, and dozens more were seriously injured.

Continuing the series on how national culture shapes safety, this post examines what outdoor programs can learn from aviation about cultural risk. It outlines what cultural risk is and how to identify the hidden assumptions that shape it in the forms, policies, and other tools used in everyday work.

Case Study: Asiana 214

As the aircraft descended toward the runway, the crew believed the automated systems were managing speed.

They were not.

The autothrottle had disengaged, and airspeed decayed without correction. By the time the instructor pilot intervened, the aircraft was too low and too slow to recover.

The investigation found no mechanical failure. Instead, it found that the crew was confused about what the automation was doing and delayed manual intervention.

Cultural norms around authority and hierarchy made it difficult for junior crew members to speak up. The trainee pilot later told investigators that initiating a go-around was difficult because, even though he was seated in the captain's chair, he deferred to the instructor pilot's seniority: “…That is very hard to explain, that is our culture” (Chow et al., 2014).

The NTSB investigation (2014) found good reason for the pilots' confusion. These factors were in place well before the pilots set foot in the cockpit:

  • Asiana's training emphasized reliance on automation, a cultural preference that, according to researchers, differs between Eastern and Western flight crews (Sherman et al., 1997).

  • Boeing built and sold Asiana its training curriculum, which encoded Western assumptions about how pilots should interact with automation and when to intervene manually.

  • At the regulatory level, the FAA knew that cultural and language differences increased risk and had recommended further study as early as 1996. The FAA had also argued for minimum-speed protection, meaning the autothrottle should engage on its own when airspeed dropped below a minimum. Boeing did not include it. The FAA accepted the design anyway.

The crew followed the procedures. The autothrottle worked as designed, and the training curriculum delivered exactly what its designers intended. What happened in the cockpit that day was a clash between the hidden assumptions embedded in these task objects and the cultural norms of the flight crew.

The system behaved exactly as it was designed.

How Common Tools Carry Hidden Assumptions

Task objects are the artifacts of everyday work. They guide planning, interpretation, and decision-making through tools and documents like manuals, policies, procedures, checklists, plans, forms, templates, and training curricula.

Task objects define the container for downstream decisions and actions: they prescribe who is responsible for what, what information gets gathered, and who decides. For example, an activity policy or a job description determines who can modify a program itinerary, a medical form determines the depth of information families share, and a training curriculum tells staff what to pay attention to and how to react.

In the case of Asiana 214, the airline’s training curriculum framed what pilots were prepared to notice and how they were expected to respond.

Cultural risk arises when a task object, encoded with assumptions about what is relevant, normal, or worth paying attention to, clashes with the norms and beliefs of the people using it (Slay et al., 2025). The designers of these objects, the regulators, engineers, and administrators who set standards, build airplanes, and design forms, base their design logic on their own cultural norms and preferences, such as a preference for automation over manual control. The users of these artifacts interpret the same object through their own cultural frameworks, which influence what they notice, account for, or ignore.

When those cultural frameworks align, the object is more likely to work as intended. When they don't, important signals are omitted, misunderstood, or dismissed as irrelevant.

The risk is not that people ignore a rapidly descending airplane or a question on a form. Most people don’t have a death wish; they want things to go smoothly and safely.

The risk is that they interact with the object in a different way than the designer intended, and important safety signals are not seen or sent.

Health Questionnaires

For example, if the program's plans and expectations for an activity are unclear, a parent has to work harder to decide what information about their child is most relevant to disclose so staff can make the best decisions for the child's safety. These disclosures can be culturally coded: mental health, for instance, is discussed far more openly in some cultural contexts than in others.

Consent Forms

Similarly, a consent form may ask a parent to gauge their child's water comfort and swimming ability, a judgment that depends on the aquatic environment the parent expects the child to engage with (see How cultural factors shaped the Mangatepopo Gorge tragedy).

Sometimes these mismatches add up to harmless confusion. But at other times, especially when cultural power dynamics between ‘expert’ school or program staff and ‘novice’ parents are at play, caregivers may be inadvertently pressured, even coerced, into giving consent they would not give with a complete and accurate understanding of the risks involved.

Training Curricula

Assumptions about challenge, growth, and risk management are embedded in training curricula and ultimately shape staff beliefs about the program’s purpose and how they view their jobs (see How culture shapes common beliefs about outdoor education).

Staff carry these beliefs from program to program and region to region. When those assumptions are born from specific cultural constructs, and those constructs misalign with the cultural context of participants and program locations, staff decisions and actions may be poorly suited to the setting they're in.

In some cases, these misalignments add up to confusion or a funny story later. At worst, they contribute to psychosocial or physical harm. This is exactly what happened aboard Asiana 214, where the training curriculum encoded assumptions about automation and authority that failed to hold up in the cockpit that day.

Where to Start

Not all task objects are equally likely to introduce cultural risk. In my research, I found the most important ones are those that determine a) what information is gathered, b) how risk is framed, and c) who holds decision-making power (Slay, 2020).

You should examine the underlying design logic of these task objects in your program (a simple way to rank them for review is sketched after the list):

  • Risk assessments

  • Program curriculum and staff training curriculum

  • Program briefing materials

  • Policies, procedures, and SOPs

  • Safety briefing materials and checklists

  • Emergency response plans

  • Consent forms and information disclosures, including health forms, and templated parent and student presentation and preparation materials
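To make this triage concrete, here is a minimal sketch in Python of how a program might inventory its task objects and rank them for review against the three criteria above. The object names, flags, and scoring are hypothetical illustrations, not a prescribed method.

```python
from dataclasses import dataclass

@dataclass
class TaskObject:
    name: str
    gathers_information: bool   # a) determines what information is gathered
    frames_risk: bool           # b) determines how risk is framed
    allocates_authority: bool   # c) determines who holds decision-making power

    @property
    def review_priority(self) -> int:
        # Each criterion the object touches raises its review priority.
        return sum([self.gathers_information, self.frames_risk, self.allocates_authority])

# Hypothetical inventory; a real program would list its own documents.
inventory = [
    TaskObject("Health questionnaire", True, False, False),
    TaskObject("Swim-activity consent form", True, True, False),
    TaskObject("Emergency response plan", True, True, True),
]

# Review the highest-scoring objects first.
for obj in sorted(inventory, key=lambda o: o.review_priority, reverse=True):
    print(f"{obj.review_priority}/3  {obj.name}")
```

The point of the sketch is the ordering, not the numbers: an emergency response plan that gathers information, frames risk, and assigns authority deserves review before a form that only gathers information.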

How to Find Cultural Risk Embedded in a Task Object

When reviewing a task object, ask the following questions (a simple way to record the answers is sketched after the list):

  1. What assumptions does this object make about knowledge, values, and experience?
    A consent form for a swimming activity that inadequately describes the intended swimming environment.

  2. In which situations is an action mandatory, and in which can judgment and preference inform it?
    A code of conduct that requires individual completion of a program activity rather than individual contribution to the group’s success and safety in an activity.

  3. What forms of uncertainty, dissent, or reinterpretation does it allow or suppress?
    A safety briefing checklist that omits prompting the instructor to check in with group members to gauge understanding or address anxiety about the activity’s requirements.

  4. Which cultural perspectives are centered, and which are secondary?
    A curriculum that favors learning for individual self-discovery and expression over learning through relationship to others and positive contribution to the group process.

  5. What would a reasonable person from a different cultural context misunderstand, resist, or reinterpret here?
    An emergency response plan that lists a series of actions but doesn’t define what constitutes an emergency or who has the authority to declare one.
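To make these reviews repeatable, the five questions can be captured in a simple record, one answer per question per task object. The sketch below is a hypothetical illustration, not the author's method; the example answers for a swim-activity consent form are invented.

```python
# The five review questions, condensed from the list above.
REVIEW_QUESTIONS = [
    "What assumptions does this object make about knowledge, values, and experience?",
    "Where is action mandatory, and where can judgment inform it?",
    "What uncertainty, dissent, or reinterpretation does it allow or suppress?",
    "Which cultural perspectives are centered, and which are secondary?",
    "What would someone from a different cultural context misunderstand or reinterpret?",
]

def review(task_object: str, answers: list[str]) -> dict:
    """Pair each question with the reviewer's answer for one task object."""
    if len(answers) != len(REVIEW_QUESTIONS):
        raise ValueError("Provide exactly one answer per question.")
    return {"task_object": task_object,
            "findings": dict(zip(REVIEW_QUESTIONS, answers))}

# Invented example: reviewing a swim-activity consent form.
record = review("Swim-activity consent form", [
    "Assumes families picture the same 'swimming environment' the program does.",
    "Signature is mandatory; no space to qualify or ask questions.",
    "No channel for parents to voice uncertainty before signing.",
    "Centers staff definitions of ability; the family's context is secondary.",
    "A parent may picture a pool where the program means open water.",
])

for question, answer in record["findings"].items():
    print(f"- {question}\n  {answer}")
```

Writing the answers down, in whatever form, matters more than the tooling: a recorded mismatch can be revisited and redesigned; an unrecorded one resurfaces as a surprise in the field.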

If the purpose and intended use of a form, checklist, or policy is unclear to its users, or otherwise culturally misaligned, more training does not necessarily solve the problem. Similarly, if the tool itself contributes to poor or incomplete decisions, stricter enforcement will only reinforce the problem.

The purpose of reviewing task objects is to decide whether the tool needs more information, clearer language, or a redesign before expecting people to use it well.

 

Safety isn't just about what people do. It's about what the tools asked them to do in the first place.

 

References

Chow, C., Tuleja, E. A., & Yu, J. (2014). The impact of cultural values on communication in multicultural crews: The case of Asiana Flight 214. International Journal of Cross Cultural Management, 14(2), 191–208.

National Transportation Safety Board. (2014). Descent below visual glidepath and impact with seawall, Asiana Airlines Flight 214, Boeing 777-200ER, HL7742, San Francisco, California, July 6, 2013 (Aircraft Accident Report NTSB/AAR-14/01).

Sherman, P. J., Helmreich, R. L., & Merritt, A. C. (1997). National culture and flightdeck automation: Results of a multinational survey. International Journal of Aviation Psychology, 7(4), 311–329. https://doi.org/10.1207/s15327108ijap0704_4

Slay, S. (2020). How cultural perceptions influence risk: A new method for identifying cultural risk [Master's thesis, Prescott College].

Slay, S., Dallat, C., & Mitten, D. (2025). A cultural risk assessment of led outdoor activities. Journal of Outdoor Recreation, Education, and Leadership. https://doi.org/10.18666/JOREL-2025-12408

Stuart Slay

Stuart Slay is a safety leadership coach and consultant working with schools and outdoor activity programs. He is based in Taipei, Taiwan, and Seattle, Washington.
