The Aviation Logistics Plan seeks to “modernize existing, time-tested support strategies, as well as capitalize on emerging capabilities and technologies offered by the industrial base.” Collectively, these efforts will enhance the air combat element by improving the readiness (i.e., the effectiveness, reliability, and availability) of Marine Corps aircraft.
The Marine Corps Aviation Plan states that “maintenance concept changes, repair capability standup, and contract strategy changes” must be enabled to keep pace with readiness challenges.
Almost every piece of modernized equipment is network-driven. The complexity of modernized equipment forces maintainers to take an active role in the setup, configuration, operation, and maintenance of this equipment.
Maintenance officers will be crucial to the transition to new equipment and training, providing the subject matter expertise that allows operators to employ their weapons systems successfully. The enlisted maintainer of the future will have to be agile enough to adapt to rapid changes in capabilities and system implementation.
Maintainers must be as competent in basic data-link status and implementation as operators. Alignment between the roles of operators, maintainers, and tactical users will continue to be essential to success in all future missions.
The Marine Corps uses tracking systems to program funds for their intended use, but those systems are not clearly linked to readiness goals: when the Marine Corps programs funds, it identifies them with tracking-system codes, yet when it executes those funds, it identifies them with a different set of fiscal codes.
As a result, the Marine Corps cannot link the programmed intent of the funds to readiness, making it difficult to track funds through the budget cycle.
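To make the disconnect concrete, here is a minimal sketch in which all codes and dollar amounts are hypothetical, invented for illustration: programming and execution records are keyed by different, unmapped code sets, so no execution record can be joined back to its programmed intent.

```python
# Programming phase: funds identified by tracking-system codes ($M).
# All codes and amounts below are invented for illustration.
programmed = {"TRK-1001": 2.5, "TRK-1002": 1.2}

# Execution phase: the same funds identified by a different set of
# fiscal codes ($M), with no mapping back to the tracking codes.
executed = {"FSC-77A": 1.9, "FSC-77B": 1.8}

# The crosswalk table that would link the two code sets does not exist.
crosswalk = {}

# Attempting to trace each programmed code to its execution record fails.
traceable = {t: executed.get(crosswalk.get(t)) for t in programmed}
print(traceable)  # every programmed code maps to None
```

Without a maintained crosswalk between the two code sets, every trace comes back empty, which is the structural reason programmed intent cannot be followed through the budget cycle.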
Tracking the costs associated with specific exercises was difficult because officials could not attribute several large one-time training expenses to the specific training exercises that contribute to readiness. There is currently no systematic way to ensure that codes are used accurately to associate executed funds with training exercises, so the Marine Corps does not have complete or consistent data on the costs of individual training exercises.
As a result, commanders may lack accurate data for making resource decisions about training exercises needed to complete Mission Essential Tasks and improve units’ training readiness.
Several factors have made it challenging for the Marine Corps to provide Congress the information it needs.
First, the Marine Corps cannot fully track training funds through the budget cycle, making it difficult to, among other things, show that training funds were spent on readiness.
Second, the Marine Corps has not prioritized tackling the longstanding problem of how to link training resources to readiness. Although the Marine Corps has a standing order to develop an enterprise-wide performance management framework that links resources to readiness via a robust analytical framework, no single entity has been assigned the authority to enforce this order.
In the absence of that leadership, certain components of the Marine Corps have developed their own, independent initiatives that were designed to achieve the same objective of linking funding to readiness, but had their own specific approaches and intended uses.
The Navy will examine the readiness reporting relationships of aviation, undersea warfare, surface warfare, expeditionary warfare, information warfare, and the shore enterprise.
A related effort calls for the digitization of readiness. “When we in the Navy, or the rest of the military for that matter, talk about digitization, talk about analytics, machine learning, artificial intelligence, we usually see this through the fleet lens. We talk about how to achieve decision superiority in the fleet fight. That’s all good. But fleet readiness will benefit significantly by bringing all of the same tools to bear on all of our readiness processes. In fact, we have more unused data available for us here than we currently do in the fight.”
“We will use descriptive analytics to mine historic data and to derive the readiness drivers behind past successes or failures, to move our understanding of readiness beyond a set of closely held beliefs and to rigorously derived facts.
“Once we understand the true drivers behind our readiness outcomes, we must apply predictive analytics to forecast how much outcomes will change with our investments in those readiness drivers. With predictive analytics in play, we will find ourselves in a place where we’re making data-informed decisions, maximizing the return on investment.”
Once the service gets the predictive analytics capability, prescriptive analytics will be the next step. “If we achieve this, the final frontier of analytical capabilities will take us to prescriptive analytics where our automated analytical tools will accelerate our decision making by suggesting decision options we may not have been aware of if we relied on human action alone."
To get there, the fleet analytics office will “develop the dashboards and reporting tools to see in real time what is going on with readiness capacity. They will also develop a risk matrix that helps us assess risk against the mission and drive accountability."
Department of Defense Barriers Limit Efforts Aimed at Providing Readiness Analytic Support to Senior Leaders Making Force Structure Decisions
DoD advised that we replace the term “force structure” with “force planning” to ensure that different audiences understand that we are referring to force sizing, shaping, capability, and concept development. DoD correctly stated that we were using the term “force structure” in a broad sense.
However, the term force planning is not interchangeable with force structure because force planning is the act of analyzing and determining force structure needs. To provide further clarification, we added a note to the report stating that when we refer to force structure analysis, it includes the force planning elements identified by DoD (i.e., force sizing, shaping, capability, and concept development).
The current DoD approach for providing readiness analytic support has not provided the timely and comprehensive analyses that senior leaders need to make informed decisions about the joint force structure needed to implement the National Defense Strategy. Senior leaders have documented in relevant DoD guidance that there are cracks in the foundations of readiness analytics, many of which originate with Support for Strategic Analysis.
This is due in part to highly detailed and complex products that are difficult to produce and inflexible to analyze; insufficient guidance to overcome the services’ interest in protecting their force structure equities; and the lack of a joint readiness analytic capability.
DoD guidance also states that Support for Strategic Analysis products should retain consistency with DoD strategy and current intelligence and should incorporate operational approaches effective at regaining readiness. Credible independent analysis of an issue requires a detailed, well-understood, up-to-date common basis for that analysis.
A key stated goal of Support for Strategic Analysis was to create a common analytic foundation so that the services’ force structures could be evaluated as a joint force—as it would fight. However, Support for Strategic Analysis has not resulted in this type of joint analysis.
Specifically, DoD guidance states that Support for Strategic Analysis is intended to facilitate the comparison and evaluation of competing force structure options and cross-service tradeoffs. DoD guidance also states that assessments of the aggregate readiness capacity of the joint force can provide an analytic foundation to identify risk and understand tradeoffs across competing demands for the force.
According to the services, Support for Strategic Analysis products provide a valuable resource and are critical to informing programmatic decisions. However, DoD guidance noted that there was too little joint analysis at the operational and strategic levels; the department lacks a body or process to conduct or review joint force analysis; and the department’s Support for Strategic Analysis efforts were focused on developing, versus analyzing, the common starting points. Accordingly, it reiterated the need for Support for Strategic Analysis to free up time and resources to conduct joint analysis and review competing analyses.
Officials said DoD currently compares and makes decisions on force structure options primarily through the budget process; however, such budget reviews are typically limited to specific areas of interest. The officials added that program and budget review is not the best place to evaluate joint force structure tradeoffs because the kinds of issues examined in the budget process are more limited in scope and generally do not include comprehensive cross-service comparisons.
Support for Strategic Analysis has not yielded the analytic support that it was intended to provide owing to three interrelated and persistent challenges: (1) cumbersome and inflexible products; (2) limited analysis that tends not to deviate from the services’ programmed force structures and has not tested key assumptions; and (3) an absence of joint analysis evaluating competing force structure options and cross-service tradeoffs.
One of the key reasons DoD did not keep the products complete and up to date was that developing and approving highly detailed and complex Support for Strategic Analysis products was cumbersome, taking a significant level of effort and time. Officials told us that developing the Concepts of Operation and Readiness Level Views, in particular, was difficult because there was a desire to gain consensus with all of the stakeholders and because the services wanted these products to have high fidelity detail in order to run their readiness models.
For example, Cost Assessment Office and Joint Staff officials told us that it usually took a couple of years to build and approve the Detailed View for one readiness scenario. The officials added that the level of detail included made the product inflexible and difficult to vary. Cost Assessment Office and Joint Staff officials agreed that this product became far too detailed and time-consuming and used a substantial amount of the department’s analytic capacity.
As a result, the officials told us that the Cost Assessment Office abandoned building additional Detailed Views. The lack of agreed-upon drivers of readiness has had other effects. For example, OSD Policy and Joint Staff officials told us that the services still wanted the comprehensive information that the Detailed View was supposed to provide for use in their readiness campaign models. Without the Cost Assessment Office producing Detailed Views, the officials noted that some of the detailed information migrated into the higher-level Concepts of Operation, making that product more difficult and time-consuming to develop and analyze as well.
Service officials told us the services have been reluctant to conduct or share boundary-pushing readiness analyses through Support for Strategic Analysis for fear that they will jeopardize their forces or limit their options. Officials also told us that the services have leveraged their participation in developing Support for Strategic Analysis products to ensure their favored major force structure elements are included in the common starting point.
Joint Staff officials noted that they were able to do this because Support for Strategic Analysis did not constrain what force structure the services could use for their analysis. That is, if the force structure was programmed, they could use it because the readiness goal was to overwhelm the adversary. However, by not significantly deviating from the starting points, the services were able to ensure that their analytic outcomes support the need for the already programmed force.
Additionally, several questionable assumptions underpin the analysis. Sensitivity analysis examines the effects that changes to key assumptions have on the analytic outcome and is helpful for understanding risk. It can therefore give decision makers insight into how readiness levels would change if conditions did not match the assumptions.
However, officials told us that the services, using Support for Strategic Analysis products as a starting point, generally have not conducted sensitivity analyses on key operational assumptions or on factors that are not static or carry some uncertainty and that, if varied, could raise or lower the risk of completing assigned tasks or missions.
According to these officials, certain questionable assumptions have not been analyzed through sensitivity analysis as part of Support for Strategic Analysis. For example, all four services tend to assume that their readiness for a conflict will be high, consistent with the level directed in guidance. But at the individual service level, the military services continue to report readiness challenges, and readiness rebuilding is anticipated to take several years. Specific details of problematic service-specific assumptions were omitted because the information is classified.
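A minimal sketch of the kind of sensitivity analysis described above, assuming a toy sortie-generation model with invented numbers (not any service's actual model or requirement): vary the assumed ready rate at the start of a conflict and observe how the outcome, and thus mission risk, changes.

```python
def sorties_generated(aircraft: int, ready_rate: float,
                      sorties_per_ready_aircraft: float = 1.5) -> float:
    """Toy outcome model: daily sorties the force can generate."""
    return aircraft * ready_rate * sorties_per_ready_aircraft

REQUIRED_SORTIES = 90  # hypothetical mission requirement

# Baseline planning assumption (80% ready) versus the lower excursions a
# sensitivity analysis would test, given reported readiness challenges.
for ready_rate in (0.80, 0.70, 0.60, 0.50):
    sorties = sorties_generated(aircraft=100, ready_rate=ready_rate)
    shortfall = max(0.0, REQUIRED_SORTIES - sorties)
    print(f"ready rate {ready_rate:.0%}: "
          f"{sorties:.0f} sorties/day, shortfall {shortfall:.0f}")
```

Even this toy model shows the point of the exercise: the mission is comfortably covered under the directed assumption but falls short once the ready rate drops below the requirement, which is exactly the risk insight decision makers lose when the assumption is never varied.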
According to service officials, the services have been reluctant to independently examine a broad range of innovative force structure readiness options or to conduct sensitivity analysis on key operational assumptions through Support for Strategic Analysis because, given competing priorities, they believe they can generally effect only marginal changes in their budgets from year to year and have limited analytic capacity.
Service officials noted that the majority of their service’s budget each year is constrained by must-pay bills, including personnel costs, supporting existing force structure, established contracts, sustaining the industrial base, and statutory mandates.
As such, unless directed to by senior leaders, service officials told us that they typically do not use their limited analytic resources to conduct sensitivity analysis or explore alternative approaches to regaining readiness. The sensitivity analyses they have been directed to conduct have generally been focused on smaller force structure changes, but have provided useful insights.
The DoD report partially addressed the element requiring a description of the readiness capacity needed to support the force structure. Specifically, the report did not provide a complete picture of required readiness levels. For example, readiness at large aircraft installations was described by square yards of apron space but did not include other requirements, such as aircraft hangars and maintenance facilities.
According to DoD officials, no department-wide guidance exists concerning DoD’s methods for selecting installations in its assessments. Without such guidance, readiness assessments may not be based on consistent methods across the department, resulting in inaccurate estimates, and neither DoD nor Congress currently has the information necessary to make decisions concerning readiness capacity across the services.
DoD’s method for estimating readiness capacity is not sufficient because its reported estimates cannot be generalized to describe readiness capacity across the department. Furthermore, DoD’s sampling method is not always implemented effectively, because some of the military departments adjusted the sampling approach.
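A small sketch of why an adjusted sample cannot be generalized, using invented installation readiness scores: a probability sample estimates the population mean, while a judgmentally adjusted sample (for example, a department swapping in only its largest installations) biases the estimate upward.

```python
import random

random.seed(2)

# Hypothetical readiness scores for 500 installations (illustrative only).
population = [random.gauss(70, 10) for _ in range(500)]

def mean(xs):
    return sum(xs) / len(xs)

# A probability sample supports generalization to the full population...
prob_sample = random.sample(population, 50)

# ...whereas an adjusted sample drawn only from the highest-scoring
# installations systematically overstates department-wide capacity.
adjusted_sample = sorted(population)[-50:]

print(f"population mean  {mean(population):.1f}")
print(f"random sample    {mean(prob_sample):.1f}")
print(f"adjusted sample  {mean(adjusted_sample):.1f}")
```

The adjusted estimate is not wrong about the installations it covers; it is simply not generalizable, which is the deficiency described above.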
The department has demonstrated a desire to fix Support for Strategic Analysis deficiencies but has thus far been unable to overcome these challenges. Without determining the analytic products needed and updating them, issuing specific guidance requiring alternatives and key assumptions to be fully analyzed, and developing an approach for conducting readiness analysis, DoD may not be providing its leaders with the analytic support they need to prioritize force structure investments that would best manage risk and address full-spectrum readiness challenges.
1. Measurable strategic goals must be established to enable assessment of readiness conditions and capabilities.
2. A unifying framework is needed to provide a clear line of sight across the capabilities, readiness, and financial communities.
3. Existing processes are stove-piped with limited visibility and often require integration at the senior-leadership level to develop a comprehensive view of the effect of dollars on readiness.
4. Multiple organizational constructs make it difficult for analysts to develop a comprehensive view.
5. Sustainable reporting requires data from an integrated and automated process.
6. The connection of operational training resources to readiness is limited by current systems and processes.
7. Manpower, force structure, equipment supply and maintenance, and training systems are unable to connect with each other or with acquisition and requirements-generation processes.
8. Fundamental business processes such as data management have major shortfalls in consistency that negatively affect the ability to defend funding requests.
9. The department lacks a fully developed and comprehensive model connecting the output of institutional processes to readiness measures.
10. The collection, storage, and transfer of data must adhere to consistent rules across major data systems in order to support decision making without stove-piped efforts.
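As a minimal sketch of item 10, in which the field names and code format are hypothetical: one shared validation rule, applied identically by every data system before records are stored or transferred, keeps records consistent enough to support cross-system decision making.

```python
import re

# One centrally owned rule for exercise codes (hypothetical format EX-NNNN),
# rather than each system inventing its own convention.
EXERCISE_CODE = re.compile(r"^EX-\d{4}$")

def validate(record: dict) -> list[str]:
    """Return the rule violations for one record; empty list means valid."""
    errors = []
    if not EXERCISE_CODE.fullmatch(record.get("exercise_code", "")):
        errors.append("exercise_code does not match shared format EX-NNNN")
    if record.get("amount_usd", 0) <= 0:
        errors.append("amount_usd must be positive")
    return errors

# The same function guards every system's intake, so a record accepted by
# one system is guaranteed to join cleanly with records from the others.
print(validate({"exercise_code": "EX-2024", "amount_usd": 150000}))  # []
print(validate({"exercise_code": "24-EX", "amount_usd": 150000}))
```

The design choice is that the rule lives in one place and is imported everywhere, rather than being re-implemented per system, which is what makes the data consistent by construction instead of by after-the-fact reconciliation.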