
Opportunity
Maj. Pete Mitchell (pseudonym) from Top Gun Academy (pseudonym), an organization dedicated to training student pilots to fly fighter aircraft, asked a team of human-performance practitioners to help him optimize the Academy's flight lead syllabus.
Top Gun Academy is a unit of F-15 fighter aircraft pilots and Weapons Systems Officers (WSOs) who train student pilots and WSOs to fly the F-15. Students enter the Academy at various ability levels and with unique goals. The Academy's mission is to train flight students to advance to the next level of aircrew aviator. To that end, Academy instructors host a number of programs aimed at developing specific competencies. One of the programs Mitchell leads is the flight lead program, in which aircrew wingmen have the opportunity to advance to the role of flight lead. A flight lead commands a formation team of two or more wingmen during a mission.
The current flight lead syllabus outlines a structured program designed to take existing wingmen, who are proficient in aircraft operation and individual task execution, and turn them into competent flight leads who oversee the strategy and execution of a given mission. After successful completion of three instructor-led courses and eleven graded flights, wingmen graduate as flight leads. Our client's goal was to identify opportunities to improve the curriculum so that it more effectively trains students to move through the program and competently take on the role of flight lead.
Solution
A robust front-end analysis (FEA) allowed me to conclude that the performance problem could be solved through a training intervention. The learning solution came with two main recommendations:
1. Clarify learning objectives and improve evaluation tools.
| Current State | Proposed Future State |
|---|---|
| Each event contained teaching objectives, a high-level outline of instructional tasks and/or topics, and a list of reference materials. In addition, instructors used a single generic rubric to assess learner performance across all 11 sorties, even though each sortie required the learner to demonstrate unique knowledge and skills. | 1. Add performance-based (terminal) learning objectives to each event. 2. Create one rubric per sortie that aligns with that event’s learning objectives. |
2. Improve brief/debrief performance outcomes through enhanced instruction.
| Current State | Proposed Future State |
|---|---|
| Though the client indicated that 90% of what the learner accomplishes in flight stems from delivering an effective brief and debrief, this represents only a small portion of the current curriculum. | 1. Draft terminal and enabling performance objectives for briefing and debriefing. 2. Create a more robust, instructor-led lesson plan that gives learners an opportunity to practice in a scaffolded environment. 3. Develop a job aid to support job transfer. 4. Create a structured skill check (formative) with access to just-in-time coaching/feedback to practice preparing and delivering briefs/debriefs. 5. Develop a skill assessment (summative) to measure learner proficiency. |
Approach
Step One: Analyze the performance problem
To better understand the performance problem, I performed a needs assessment in three phases. These phases are based on the Learning and Performance Support (LeaPS) model developed by OPWL faculty at Boise State University, a systematic approach to human performance improvement. It begins with analysis, which informs the design and development of the learning intervention, which in turn informs how the training is implemented and evaluated. The process is iterative: new information that surfaces in the design phase, for instance, may change our understanding of the performance problem and cause us to reexamine the root cause. Or the pilot may reveal flaws in implementation that compel us to reexamine the efficacy of a particular instructional object, or even to create a new one.
I started by performing three types of analyses:
- Front-End Analysis: Using the Training Requirements Analysis (TRA) questionnaire, I interviewed Mitchell and reviewed several courseware documents (i.e., syllabus, rubrics) to answer a series of targeted questions, which allowed me to conclude that Top Gun Academy had a skill gap worth closing. One of the most salient gaps centered on preparing and delivering briefs and debriefs. As wingmen, learners had been on the receiving end of many briefs/debriefs, but had never been tasked with delivering them. The current program does not provide explicit instruction in these critical skills, relying on informal, experiential knowledge to do much of the heavy lifting. For this reason, I zeroed in on two of the four most salient gaps: preparing and delivering mission briefs.
- Environmental Analysis: Using the Learner and Environmental Analysis (LEA) questionnaire, I was able to identify what environmental factors would both support and detract from learning acquisition and job transfer. Top Gun Academy leans toward the tenets of a traditional learning community, structured around face-to-face instructor-led training in a classroom setting with clearly defined start and end hours. It is also well-equipped to provide real-world, hands-on training. Students can access classrooms, a training simulator, F-15 aircraft, hangars, runways, and airspace. They also have frequent and ready access to laptops, instructors, and subject matter experts for ongoing coaching and performance support.
- Learner Analysis: Using the same LEA questionnaire, I was able to piece together a clear profile of the typical flight lead student. These learners tend to be ambitious, driven, hard-working, and high-achieving individuals. They enter the program as certified wingmen, having already served in that capacity for an average of 1 to 1.5 years. Most students are male, ages 22-26, and college-educated.
After the learning solution received a green light from our project sponsor, I proceeded with a task analysis (TA), narrowing our focus from preparing and delivering briefs generally to preparing and delivering a brief for an air combat maneuver (ACM) mission. Using the Task Analysis questionnaire as a scaffold, I engaged Mitchell in several 1:1 interviews during which we answered the question, “How does a flight lead deliver an effective ACM mission brief?” By the end of that process, we captured an accurate, complete, and authentic record of the task. This TA guided the subsequent design and development of our courseware.
Step Two: Design The Learning Solution
With the analysis work under my belt, I was able to move into the design phase confident about what our learners needed to be able to do in order to master this task. I defined our terminal objective as: “By the end of CU-2, flight lead candidates will be able to prepare and deliver a ≥ 60-minute mission brief for an ACM that contains all data needed to adequately equip flight formation members.” In order to do this with efficacy, I mapped out our enabling objectives:
By the end of this lesson, flight lead candidates will be able to:
- Describe the purpose of a brief.
- Prepare a briefing for an air combat maneuvers (ACM) mission.
- Deliver a thorough ACM brief to formation members.
With these objectives as my North Star, I set out to create a lesson plan. Applying the principles of performance-based training, we sought to create a learning experience broken into three parts:
- Guided observation
- Guided practice
- Demonstration of mastery
To do this, we structured the lesson so that learners could see an example of a portion of the briefing, assess that example against a rubric, practice the same portion in a low-stakes environment, receive feedback from peers and the instructor, and then put all the pieces together in a summative assessment at the end of the lesson. In this way, learners could bridge the gap between their prior experience as recipients of briefings and their new role as facilitators of briefings, using objective criteria to guide their performance.
Step Three: Develop Instructional Materials
In addition to a lesson plan, I also needed to create some helpful performance-support tools. In sum, the course’s final deliverables included four distinct tools:
Facilitator Guide: This 13-page, instructor-facing facilitator guide provides a detailed breakdown of the course that guides students through preparing and delivering an ACM brief. It includes terminal and enabling objectives, prerequisites, a curriculum map, facilitator activities, methods, learner activities, and section durations.
ACM Briefing Job Aid: This job aid is learner-facing and designed to provide students with prompts that get them to think critically about what details they need to include in each section of an ACM brief. It also includes the target duration of all five sections as well as the relevant reference materials to access, as applicable.
ACM Briefing Rubric: This rubric is both learner- and instructor-facing and provides both parties with the success criteria for delivering an ACM brief. For the learner, this clears up any misconceptions about what should be included in their briefing. For instructors, it provides a clear, consistent tool they can use to coach, remediate, and/or validate learner proficiency.
ACM Debriefing Rubric: Though the instructional materials do not explicitly cover the preparation and delivery of an ACM debriefing, they do provide a model for the instructional team going forward. Like the ACM Briefing Rubric, this rubric is the first of its kind and provides both learners and instructors with an objective set of criteria to assess learner proficiency.
Results
Client Impact
The client entered this relationship believing he only needed some slight adjustments to the course syllabus, but left the experience appreciating that the entire program would benefit from a more robust, HPI-centered overhaul.
During the analysis phase, it became evident that the program goals didn’t always hew to the learning objectives stated in the course. Further, the courseware often failed to prepare learners to perform those very objectives.
This isn’t uncommon within the realm of workplace learning.
One of the best parts about working in this field is seeing how these efforts make an impact, and I’m so proud of my team for creating a solution that packs a punch!
Lessons Learned
The biggest challenge at the outset was the ambiguity of the ask. The broad parameters forced me to drill down with our client to get to the heart of the performance problem and how we could meaningfully intervene. Once the opportunity became clear, however, I was pleased to see how the application of the LeaPS model, along with other tried-and-true systematic methods, allowed us to conceive of a learning intervention that moved the needle for our client.
Another challenge was the classified nature of the work. Though I was able to review some valuable extant data (e.g., course syllabus, existing rubrics), I did not have access to the primary textbook, existing courseware (e.g., slide decks), or formative assessments (e.g., graded student work) that otherwise would have allowed me to triangulate inputs. I also lacked access to current or former program students or other instructors, who would have been able to provide a more three-dimensional snapshot of the program's current state, and perhaps additional improvement recommendations.
References
- OPWL 537 Course Instructors. (2021). Instructional design course handbook (4th ed.). Department of Organizational Performance and Workplace Learning, Boise State University.
- Rothwell, W., Benscoter, B., King, M., & King, S. B. (2016). Mastering the instructional design process: A systematic approach (5th ed.). John Wiley & Sons.
