As the maturing community integrates new concepts and processes, Multi-Domain Operators must identify and ingrain the valuable lessons along the way. Creating a set of standards to capture feedback and drive improvement is vital for development in any organization.
The debrief culture of the Air Force fighter community, among others, is well-known for its direct, highly effective feedback and learning methods. This type of focused feedback is important to the fighter community because the debrief is where the majority of learning takes place. The MDO community would benefit greatly by utilizing this debrief culture as a model from which to develop its own unique culture of consistent, iterative improvement.
Because a standard day, or sortie-equivalent, is not yet fully fleshed out for Multi-Domain Operators, the purpose of this report is to convey the necessity for debriefing lessons learned, and provide best practices in their current form. The ultimate objective is to create a foundation for the MDO community to adapt these practices as the details and nuance of its daily execution become more specific and clear.
The first step to enable this advancement is instilling a culture of debrief, direct feedback, and constructive learning within the MDO community. Many communities across the Air Force embrace a debrief culture, though some have unique formats and standards to tailor learning to respective needs. The debrief is designed to focus analysis on either the accomplishment or failure to accomplish desired learning objectives (DLOs) and/or mission objectives.
Mission objectives drive the planning or execution items that must be accomplished to be successful and therefore expose the areas of individual, crew, or team performance that must be addressed to correct for future iterations. Regardless of distinctive design, any effective debrief identifies errors and provides fixes for those errors, while also allowing those who did not directly commit a given error the advantage of learning from others’ mistakes.
Since there is not enough time for each operator to make all the mistakes, this type of learning creates efficiency by reducing repetitive errors across the group that is present for a given debrief. Now, multiply this effect across entire communities.
The fighter aviation community has refined its debrief process over several decades; it is fundamental to fighter culture. Any organization can utilize fighter debrief concepts as a reference—or even baseline—to develop its own culture of debrief. Being composed of personnel from many different career fields and backgrounds, the MDO community must be deliberate about, and dedicated to, the development of appropriate debrief formats and standards.
Since the MDO process is still early in its development, it is critical to build the foundation of this debrief culture in the Multi-Domain Warfare Officer training center and Air Command and Staff Multi-Domain Operational Strategist (MDOS) concentration (soon to become JADS – Joint All-Domain Strategists).
One way to achieve this is for the training centers to leverage the proven fighter debrief process in establishing an MDO debrief methodology. This can inform the MDO community’s initial, essential steps in developing a format and standards for efficient and effective feedback.
To understand fighter debrief culture in a way that helps the MDO community relate it to the eventual structure of an MDO day, or an MDO mission, it is important to describe that fighter culture in its native context. Debrief has always been an important part of fighter aviation culture, facilitating honest and direct feedback on every mission element.
As Combat Air Force (CAF) flying hours continue to decrease, debrief has become even more important to ensuring everyone receives required training. Additionally, the operations tempo requires debriefs to be direct and succinct, given the limited time available after mission planning, briefing, and flying the mission. By the time the debrief starts, aircrew have likely already been at work for a full day.
To maintain focus and aid efficiency, debriefers commonly use the mantra “Plan, Products, Brief, Administration, Tactical Admin, and Execution” to address all portions of the mission. At the beginning of the debrief, it is helpful to keep sections like “Brief” as simple as possible by asking, “Was there anything from the brief that negatively affected your execution today, or that you have questions about?”
Directing this question to the room allows the debriefer to quickly address pre-execution issues, and then move to the mission itself. However, the brief may have negatively affected execution in a way that remains to be determined in debrief, so it should also be considered during the debrief focus point (DFP) development.
Utilizing this debrief structure, the debriefer quickly addresses issues in each pre- and post-execution section with the flight participants until arriving at mission execution. Mission execution review is designed to focus the debrief so each person can improve for the next mission. This does not mean each person gets individually debriefed, but rather that those who made errors most impactful to mission success or failure have those errors identified and corrected in a way everyone can learn from them.
All participants should leave understanding how to better execute the mission. The succinct, direct nature of fighter debrief is equally applicable to the MDO community. An additional key to ensuring an efficient and effective debrief is setting aside personal stakes and ensuring rank does not impede instruction for correctable mistakes.
Debrief attendees should behave professionally, and critiques of execution should not be personal in nature, nor taken personally by flight participants. Aircrew must avoid defensive attitudes and cannot make excuses for poor performance. To this end, mission reconstruction should focus on facts, so instructional fixes can be objective corrections to demonstrated errors. If crews take debrief points personally, or if pride stands in the way of learning, valuable lessons are lost.
The individual running the debrief sets rules of engagement (ROE), which are designed to help avoid pride issues. ROE can vary depending on the squadron and the person in charge of the debrief. Here we provide an example of debrief ROE, developed over several years of flying fighter aircraft. Although not all-inclusive, it provides a good starting point.
Different communities have passed down similar rules throughout the years, and everyone has their favorite—or most important—rule. Some rules offer great insight into a portion of the debrief process. They lay an initial foundation that helps underpin the essence of the debrief: an investigation into the errors made.
The overall goal is to show the facts of what occurred in order to ascertain, prove, and teach the fix, i.e., a “lesson learned,” for everyone to internalize from the debrief. This type of debrief is only possible in the limited time available if everyone is honest about mistakes and is ready to learn.
Another critical facet of making this type of debrief possible is careful selection of who runs each debrief. It is important to develop a community standard. As a general rule, whoever established the DLOs that drive the mission objectives should run the debrief. This is usually the same person who prepared the mission and gave the briefing.
Especially important is the maxim that rank has nothing to do with who runs the debrief. The squadron commander—or the wing commander—may be in the formation, but the day’s lead or instructor is the most appropriate to lead assessment of facts and fixes. So there is no rank in the debrief. Per the ROE, this does not mean one can say whatever he/she wants. Always remain professional. This helps establish a respectful balance, while taking advantage of the reality that learning can come from anyone, regardless of rank.
The mission analysis process assesses accomplishment of the DLOs. If a formation fails to accomplish a specific DLO, the process then identifies the errors that led to the failure. These errors become DFPs or learning points (LPs), the former having a more significant impact on mission success than the latter. Once the debriefer identifies the DFPs or LPs, he/she categorizes each into one of three areas: perception, decision, or execution. After error categorization, the debriefer provides an instructional fix to maximize learning and to ensure those present can make a tangible change or correction for future missions.
Combining a DFP with an instructional fix results in a lesson learned—the critical element to community improvement. DFPs and LPs should be the focal point of the debrief because they distill vast amounts of data into concise and effective lessons for each participant. If the debriefer does not identify the DFP or LP, untargeted analysis of the minutiae can dilute the debrief’s focus, and those listening can lose interest or get confused.
A debriefer identifying every minor error someone makes might not only waste valuable time, it can also serve to browbeat an individual, often leading to shutdown and an inability to actually learn. Instead, DFPs developed from the DLOs give the debrief focus.
The debriefer identifies the DFPs during the reconstruction portion of the debrief. Whereas DFPs are failures in mission or tactical objectives (i.e., DLOs), learning points arise when the formation accomplishes the DLO in spite of significant mistakes, or in a non-traditional way (e.g., the formation was able to complete the mission but made significant errors that can be debriefed). Learning can come from successes (using LPs) or from failures (identifying LPs and/or DFPs). In each case, the DFPs and/or LPs provide a common reference point and keep the debrief focused and succinct.
While the fighter community uses the mantra “Plan, Products, Brief, Admin, Tactical Admin, and Execution” to ensure all portions of the mission are addressed, another simple process applicable to any type of event is the five questions to outline an easy-to-remember checklist to guide debriefs:
1. What happened?
2. What went right?
3. What went wrong?
4. Why did it go wrong?
5. What are the lessons learned?
Step one: “What happened” is the process of validating the mission and tactical objectives. In other words, did the flight accomplish the DLOs?
Step two: “What went right” is an important part of the debrief process for two reasons. First, a debrief should not be just negative; and second, it is always good to use this step to show the group how things are supposed to look—it is motivating, reinforces good habits, and gives people something to replicate. Additionally, sometimes optimal execution is accomplished without recognition or by unintentional action, and should be highlighted to ensure understanding for application in the future.
Steps three and four: “What went wrong” and “why” are where the debrief loop is utilized. Step three is not merely focused on “who made the mistake.” Similarly, step four asks “why,” not “who.” Referencing the aforementioned debrief ROE, do not make the debrief personal.
Step five: “What are the lessons learned” relates back to DFP and LP development; however, this discussion should be carried further, to incorporate lessons learned into the next execution cycle’s planning process. This process allows a wider group of people to learn from the debrief, growing the community as a whole.
Determining why the error occurred is a vital part of debrief and is where most debriefers have trouble. The tendency is to make an assumption on why someone made an error and then give them a fix to that assumption. However, when the person running the debrief utilizes the third step of the debrief loop correctly, he/she asks direct questions of the person who made the mistake to get to the “why” of the error. This is where it is important that all participants of a debrief adhere to rules four and five of the Debrief ROE.
When determining the “why,” the debrief loop recommends the use of the P/D/E model—Perception, Decision, and Execution. Using this model, the debriefer asks the correct questions to accurately determine the “why.” The person running the debrief should ask questions which categorize the error in perception, decision, or execution and then use that information to deliver an instructional fix (IF). An IF should be easy to follow and easy to implement in future missions.
Debrief for the MDO Community
While certain communities within the Air Force utilize very effective debrief methodologies, none of these directly address operations or planning in the MDO environment. There will be an initial hurdle of developing an accepted debrief standard for the MDO community, as it is built out of a diverse pool from around the DoD.
Many people may not be familiar with the previously described “fighter” debrief style, may find the direct feedback too personal in nature, or may misconstrue the feedback as an official report instead of seeing it as simply a way to improve future efforts. These differences in backgrounds, and in conceptions of feedback, make it even more important for the MDO community to establish a standard for debrief.
In conjunction with introducing the MDO community to the debrief process and etiquette, the community would also benefit from identifying the mission areas most appropriate for applying the debrief process. Five regularly occurring processes from the planning and execution stages are ripe for iterative learning and application of debrief methodology, which would ultimately reduce execution errors.
When the MDO community formally develops a debrief methodology it is recommended that the following five areas be reviewed. These areas are not the answer to how to develop a debrief, but are instead intended to be ideas that spark discussion and drive development in the MDO community.
The first area the MDO community could benefit from debriefing is planning process assumptions. It does not matter if the planning process is for a wargame, for a staff-level task, or for an MDO mission. When executing the planning process, it is important to identify the assumptions made about the task at hand.
Assumptions allow the team to maintain forward progress by focusing effort, but they also have varying degrees of inherent risk. This risk is dependent on multiple factors, including how the assumption was derived, the confidence level of the assessment, and the gravity of the consequences if the assumption turns out to be partially—or entirely—invalid.
It is imperative to document these assumptions for all to see and for the team to periodically revisit. Putting them on a whiteboard in the room is a great technique to enable constant review and to allow mission partners or late arrivals to catch up to the group.
Listing assumptions in plain view has the additional benefit of ensuring all participants can read, validate, or (in some cases) challenge an assumption during the planning process. If a late arrival or the commander highlights an invalid assumption, the team can make immediate and early adjustments to the scope and scale of the planning.
However, an invalid assumption that goes uncaught can affect the overall mission and could result in a failure to accomplish a tactical objective. In this case the team should treat it like a DFP: “Why was the assumption incorrect, and how did that affect the overall outcome of the planning process?”
Additionally, when the planning team arrives at the end of their process and briefs the plan, avoid assuming that, if the commander selected the planners’ recommendation, the assumptions were correct. Assumption validation occurs as execution unfolds and those assumptions prove valid or invalid in real-time.
Because of this reality, it is best to validate assumptions after execution and capture the results of the debrief for future planning efforts. While some assumptions will ultimately be affected by enemy decision-making, a formal debrief will identify those factors the planning team could have predicted in the planning phase. It may also have the capacity to identify whether planners were cognizant of the risks to assumptions depending on enemy decisions, which should have been a significant factor in contingency planning.
Risk is a second area in which to apply the debrief process, as risk is vital to commanders at all levels. To facilitate this type of debrief, risk should be categorized into risk to mission, risk to force, and risk to timing and tempo. The risk involved with a decision is a large assumption made during the planning process.
Comparing planners’ acceptable risk to the risk the commander wants mitigated can be an additional factor to debrief. LP 1: “Why did the planning team assume a higher risk than the commander was willing to accept?” Once developed, these risk lessons can be fed into the planning cycle to inform better future risk mitigation. Risk is not the same in every scenario, and every commander’s risk tolerance is not the same, but understanding allowable risk in a complex environment is a great place to debrief.
After Wargame Execution
A third area where the debrief methodology would be appropriate is following wargame execution. Given the time and monetary investment a wargame requires, it is vital to execute the wargame process as correctly and effectively as possible. When developing courses of action for the commander, the MDO community can use wargames to identify modifications or to allow the commander to select the best course of action.
Executing a debrief at the end of the wargame can identify lessons learned for blue mission planners, and can ensure all participants leave with a shared, clear understanding of the outcome. This helps to prove what modifications to the plan are necessary. Since the red team has immersed itself in the enemy’s decision-making process, the red team should utilize the five questions to provide details to the blue team for their use in executing the debrief loop.
A fourth area for the MDO community to leverage the debrief methodology is during flexible deterrent option (FDO) and strategic response option (SRO) development.
The MDO planning cycle can be time-consuming, as it consists of developing observed and desired systems, executing center of gravity and decisive point analysis, building a logic map, and filling out a decision support matrix, a decision support template, and a synchronization matrix to build the SRO. It may take several months to validate an SRO and, therefore, delay feedback to the planners, meaning lessons are potentially lost over time.
By adopting a debrief culture, the MDO community could generate lessons learned during the process and incorporate them into current and future planning cycles, thereby reducing errors and increasing effectiveness across the entire community.
Air Operations Exercise
The final area the MDO community could utilize a community-wide debrief methodology is during exercises at the Air Operations Center (AOC) level. The tendency is to run the exercise, execute a 3 up and 3 down slide, and then return to standard business. The 3 up and 3 down debriefs only highlight 3 positives and 3 negatives from the entire exercise. This type of wave-top after action assessment does not maximize the learning and growth that can come from this type of exercise.
Executing a robust exercise at the MDO level requires a great deal of time, effort, and resources. Therefore, it deserves a debrief methodology to ensure the lessons learned are fully captured. There are many ways to accomplish this, whether at the completion of each air tasking order (ATO) day, or at the completion of the entire exercise.
Establishing a standard that facilitates root cause analysis and open discussion of errors among key participants is crucial in moving the MDO community forward. Preventing recurring mistakes in the five recommended areas is the ultimate benefit of a well-developed debrief process. This is why it is important for the MDO community to develop its debrief methodology (with appropriate ROEs) and find applicable areas in the community where it should be applied.
1. An 80-percent solution delivered on time is almost always better than a 100-percent solution delivered too late.
There are often solutions within your authority to direct, but they are not pursued because they were not “ideal” solutions. Ask for outside help if you think you need it, but meanwhile focus on measures that are within your control. We rarely have the luxury of finishing our to-do lists, or even the necessary and critical portions of them. We certainly do not get to the tasks designated for “when I have some free time.” The reality is that we often must complete as much as we can, as best we can, in order to survive.
2. Just because you aren’t an expert doesn’t mean you can’t evaluate the quality of data going into your decision.
Probability data about the action could be fairly good, so your team could predict outcomes fairly accurately. Statistics make it possible to make fairly accurate predictions from small sets of data. It is not possible to predict individual events, but statistics give insight into the overall results. Statistics let us make estimates about the future without knowing all the possible outcomes.
3. Sometimes data cannot be known at the critical time.
There are times when you don’t know how severe an adverse outcome could be. Had the data on which predictions were made been reliable, the frenetic nature of actions might be easier to understand. Decision-analytic measures should be reported if a predictive model is to be used for making critical decisions. Other measures of performance may be warranted in specific applications, such as reclassification metrics to gain insight into the value of adding a novel predictor to an established model.
4. Properly inform and properly engage your chain of command.
Limiting information to only one community instead of fully involving the operational chain of command, failing to use proper methods to communicate the severity of your concern to operational leaders, being inconsistent in communicating the degree of concern depending on the audience, and not using the right channels to make operational recommendations may all seem expedient. Sometimes, however, this approach does nothing to accelerate a response, and it could have quite the opposite effect.
5. Don’t presume you know more than you do about what’s going on outside your command.
When a leader asks for help, supporting commands will spin up, but they have to judge how much information is enough to update without overwhelming the supported command. Sometimes they will judge wrong and provide less information than you would like. It is fine for the supported command to validate progress, but you should not assume you know more than you actually do. Before you ever represent that support is insufficient, make sure you are absolutely certain you have a good picture of what is going on outside your command.
6. Supporting staff is just that: supporting staff.
You are the leader, and the fact you may have received bad advice from supporting staff will not protect you, nor should it. Challenge assumptions. Cross-check supposed “facts.” Make sure the information you are basing your decision on is correct. It is your responsibility.
7. Expect the worst-case scenario, in which everything will leak.
Your ship’s staff could threaten to leak a letter. In the modern world, assume everything on the unclassified network, and too much of what exists on the classified network, will be released far beyond what was intended. Whatever you write, assume it will get out, and play out scenarios of what will happen when it does, before you hit “send.”
8. Another danger: decision by mob.
The tool of creating a false narrative, then getting outside populations to amplify that false narrative, has existed for some time, but the military was thought to be somewhat insulated from it. No longer. There is extreme risk that ill-informed, or well-informed but malign, media forces will intentionally or inadvertently drive a decision in the wrong direction. This must be understood and dealt with at all levels of command, and the higher up you get, the more critical the response likely will be.
9. Panicked activity never helps.
In a crisis, there will always be a temptation to “do something.” That “something” must derive from reason and logic. Yet, in some cases, that reasoned analysis had been overcome by frenetic activity directed at getting “the machine” to move faster on a very challenging course of action. The more serious an event is, the more important it is to slow down and think.
10. Be careful when suggesting a course of action that could shift risk from a military population to a civilian one.
When outside leaders push back energetically, issues sometimes are not given the attention they deserve; instead, they are dismissed as a “political” problem, and leaders fail to pursue a course of action that considers the entire spectrum of risk.