Marine Magnet Dispatch Service Centre

Top 10 Artificial Intelligence Strategy Focus On Operations Fielding Increase Capabilities With Tasks

4/20/2020


 

The Defense Department's new artificial intelligence strategy emphasises the creation and tailoring of tools for specific commands and service branches, allowing them to move into AI operations sooner rather than later.

This strategy reflects an additional imperative: to translate the technology into decisions and impact in operations. The Pentagon is already using artificial intelligence to sort through intelligence footage. That project has been described as "a pathfinder," but the strategy is much broader than any one project.

There is a big obstacle to the Pentagon's new strategy to speed AI to troops: officials want to accelerate the delivery of artificial-intelligence tools from the lab to the field, but it is hard to obtain the massive data streams that make AI work.

DoD will develop AI tools and programs to assist with everything the Pentagon does, including combat operations. Near-term projects include efforts to better spot network deviations and to support predictive maintenance.

The Defense Innovation Unit is aiming to better predict and accelerate repairs for Bradley Fighting Vehicles. Its industry partner is building a virtual Bradley, using data streams from sensors on real Bradleys in the field.

Digital twins let us see what the best-performing Bradley would look like, because we are able to go into many of the subsystems and pull the data. From a single vehicle alone, we were able to pull terabytes of data.

We aim to use industry data from external sensors to digitally recreate the vehicles' operating environments. It is a process that could be relevant to larger military endeavors, such as the effort to design a new combat vehicle.
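
As a rough illustration of the digital-twin idea, the sketch below folds streamed sensor readings into a per-vehicle state object. It is a minimal sketch only; the subsystem names, units, and readings are invented and do not come from any actual Bradley data feed.

```python
# Minimal digital-twin sketch: fold simulated sensor readings into a per-vehicle
# state object. All field names, units, and values are illustrative only.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class SubsystemState:
    readings: list = field(default_factory=list)

    def update(self, value: float) -> None:
        self.readings.append(value)

    def average(self) -> float:
        return mean(self.readings) if self.readings else 0.0

@dataclass
class VehicleTwin:
    vehicle_id: str
    subsystems: dict = field(default_factory=dict)

    def ingest(self, subsystem: str, value: float) -> None:
        self.subsystems.setdefault(subsystem, SubsystemState()).update(value)

# Example: stream a few hypothetical engine and transmission readings.
twin = VehicleTwin("bradley-001")
for subsystem, value in [("engine_oil_temp", 92.5), ("engine_oil_temp", 95.1),
                         ("transmission_pressure", 310.0)]:
    twin.ingest(subsystem, value)

print({name: round(state.average(), 1) for name, state in twin.subsystems.items()})
```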

If we prove the value here, imagine what we can do in that environment as we build that system out. There is so much to learn from all the unused data in these industries; less than one percent of the data is ever actually used.

It is better to get in all the data right now, while you are designing the next generation of vehicles. Massive amounts of diagnostic data play the essential role, but finding that data is not easy.

The main thing that the Defense Department brings to the process is not the machine-learning rules and methodologies; those are increasingly coming from industry. It is the unique data sets. So if getting that data is a problem, then DoD has a major obstacle to realising its goals.

How will contractors get the data they need if DoD is so bad at collecting and keeping it internally? Who owns a data stream that combines industry and DoD data? Getting DoD to change how it collects and makes available data on systems like aircraft is less than straightforward. Another challenge is getting leaders to actually accept AI-generated recommendations.

"All of the processes and procedures need to change. When we get a prediction that says, 'The data show this aircraft battery sat for years in an environment with unique conditions like extreme cold, so you don't have to swap it out on a regular cycle like you're doing,' how are policy, guidance, and procedures changing to allow that to occur? That is our big question for DoD. We haven't solved that."

Artificial intelligence is code entrusted to reason through independent choices. The decisions themselves are the result of coded paths and inputs, feeding data into different parts of algorithms, weighting outcomes, and then creating an end product that is designed to be useful to the humans that consume it.

There are degrees and worlds within AI, a vast space running from deterministic to emergent behaviour and from online machine learning to targeting tools. It is in that complexity that care is most required and human direction most needed in form and function, and that is what is lacking in the new AI strategy.

Nowhere is the how or the what of AI spelled out. DoD wants to make sure investment continues in a permissive environment, but the term is an umbrella, a catch-all, with no specificity as to what it does, or why it might require efforts to train new workers.

Failure to adopt AI will result in legacy systems irrelevant to defense, eroded cohesion among partners, and reduced access to markets. "We are launching a set of initiatives to incorporate AI rapidly, iteratively, and responsibly to enhance military decision-making and operations across key mission areas.

Examples include improving situational awareness and decision-making, increasing the safety of operating equipment, implementing predictive maintenance and supply, and streamlining business processes. We will prioritise the fielding of AI systems that augment the capabilities of our personnel by offloading tedious cognitive or physical tasks and introducing new ways of working.”

AI has the potential to enhance the safety of operating aircraft, ships, and vehicles in complex, rapidly changing situations by alerting operators to hidden dangers. Implementing AI in  supply and maintenance operations can predict the failure of critical parts, automate diagnostics, and plan maintenance based on data and equipment condition. Similar technology will be used to guide provisioning of spare parts and optimize inventory levels. These advances will ensure appropriate inventory levels, assist in troubleshooting, and enable more rapidly deployable and adaptable forces at reduced cost. 

Streamlining business processes. AI will be used to streamline business operations with the objective of reducing the time spent on highly manual, repetitive, and frequent tasks. By enabling humans to supervise automated tasks, AI has the potential to reduce the number and costs of mistakes, increase throughput and agility, and promote the allocation of DoD resources to higher-value activities and emerging mission priorities.

Artificial intelligence is about generating predictions. "It takes information/data you have and uses it to generate information you don't have." In the past, collecting and parsing data, constructing models, and employing the resident statistical expertise to offer intelligible interpretations demanded significant resources.

Unlike guesses, predictions require data. More data provide more opportunities to discover critical linkages, generating better predictions. In the past, analytic techniques constrained the amount of data that could be scoured for correlations. Consequently, these techniques relied on an analyst’s intuition and they functioned only as an average, potentially never actually yielding a correct answer.

Not so with modern techniques in artificial intelligence, which feast on the immense data sets and complex interactions that would otherwise overwhelm classic statistical models. Without data, the machine of artificial intelligence would grind to a halt.

But not all data are created equal. Data must be tailored to the task at hand. An artificial intelligence trained to predict what the pixels in one kind of image represent cannot necessarily help when trying to predict what a group of pixels in a different kind of image corresponds to.

Sometimes the prediction problem is oversimplified. For example, in autonomous driving, only a single necessary prediction might be identified: "What would a soldier do?" While framing the problem this way may help an engineer move beyond a rules-based programming decision tree, to be relevant the prediction demands additional nuance.

For example, “What would a soldier do if a truck pulled out in front of him or her?” Only then can the data be searched for similar situations to generate a usable prediction. Without the nuance, the data collected is not applicable to soldiers driving their tanks on a battlefield.

Not only are data specific to the prediction, but the problems to which we can apply artificial intelligence are also situation-specific. There are "known knowns," "known unknowns," "unknown unknowns," and "unknown knowns."

"Known knowns" represent a sweet spot for artificial intelligence—the data are rich and we are confident in the predictions. In contrast, neither known unknowns nor unknown unknowns are suitable for artificial intelligence. In the former, there are insufficient data to generate a prediction—perhaps the event is too rare, as may often be the case for military planning and deliberations. In the latter, the requirement for a prediction isn't even specified. In the case of "unknown knowns," the data may be plentiful and we may be confident in the prediction, but the answer can be very wrong due to unrecognised gaps in the data set, such as omitted variables and counterfactuals that can contribute to problems of reverse causality.

Current artificial intelligence prediction machines represent "point solutions" optimised for "known known" situations with plentiful data relevant to specific, understood workflows. To understand how an artificial intelligence tool may function within a specific workflow, a "canvas" can help decompose tasks in order to understand the potential role of a prediction machine, the importance and availability of data to support it, and the desired outcome.

The most important element of the artificial intelligence canvas is the core prediction. Its identification and accurate specification for the task-at-hand are essential. Otherwise, the entire artificial intelligence strategy can be derailed.

The tools of artificial intelligence rely on available data to generate a prediction. We identify three types of necessary data: training, input, and feedback. The tool is developed using training data and fed input data to generate its prediction. Feedback data from the generated prediction are then used to further improve the process.

More and richer training data generally contribute to better predictions, but collecting data can be resource intensive, constraining the data available for initial training. Feedback data fill the gap, allowing the prediction machine to continue learning. But that feedback data must come from use in the real world. 
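
To make the three data roles concrete, here is a minimal sketch in Python. The running failure-rate "model," the part names, and the records are all invented stand-ins for illustration, not any actual DoD tool.

```python
# Sketch of the three data roles: training data builds the model, input data
# generates predictions, and feedback data (observed outcomes) flows back in so
# the model keeps learning. The "model" is just a running failure-rate estimate.
from collections import defaultdict

class FailureRatePredictor:
    def __init__(self):
        self.failures = defaultdict(int)
        self.observations = defaultdict(int)

    def train(self, records):
        # records: iterable of (part_type, failed_bool) pairs -- the training data
        for part_type, failed in records:
            self.observations[part_type] += 1
            self.failures[part_type] += int(failed)

    def predict(self, part_type):
        # input data: a part type we want a failure-probability estimate for
        seen = self.observations[part_type]
        return self.failures[part_type] / seen if seen else 0.5  # prior guess

    def feedback(self, part_type, failed):
        # feedback data: the real-world outcome is folded back into the model
        self.train([(part_type, failed)])

model = FailureRatePredictor()
model.train([("battery", True), ("battery", False), ("pump", False)])
print(model.predict("battery"))   # initial prediction from training data alone
model.feedback("battery", False)  # field observation improves the estimate
print(model.predict("battery"))
```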

Consequently, the predictions of artificial intelligence are more likely to be wrong when the tool is first fielded. Determining what constitutes "good enough" for initial release is a critical decision: what is the acceptable error rate, and who makes that determination?

Even if data are plentiful and the algorithm refined, if data are flawed the predictions will still be incorrect. Additionally, it’s important to remember that all data are vulnerable to manipulation, which would significantly degrade the tools of artificial intelligence. 

For example, feeding corrupt input data into a prediction machine could crash an artificial intelligence tool. Alternatively, the input data could be subtly altered such that an artificial intelligence tool will continue to function while generating bad predictions.

Feedback data can be manipulated to alter the performance of an artificial intelligence tool. Training data introduce their own vulnerabilities into artificial intelligence—an adversary can interrogate the algorithm, bombarding it with input data while monitoring the output in order to reverse-engineer the prediction machine. Once the inner workings are understood, the tool becomes susceptible to additional manipulation.

Detecting flawed predictions, either due to inadequate learning or adversarial data manipulation, poses a significant challenge. It’s impossible to open the “black box” of an artificial intelligence and identify “what causes what.”

While DoD is trying to resolve this shortcoming, presently the only way to validate whether the predictions are accurate is to study the generated predictions. Analysts must test for flawed predictions and hidden biases, and then feed select input data into the prediction machine to test their hypotheses.

However, since we are most likely to deploy prediction machines in situations where prediction is hard, testing of these complex predictions may prove exceptionally difficult. This challenge may be further exacerbated in military-specific scenarios.

Predictions are but an input into eventual decisions and associated actions. For example, you could estimate the likelihood of your car breaking down in the next six months, or an unexpected relocation, but these predictions may not alter your decision to purchase a new car, because they don’t determine the value you have assigned to the outcome of driving a new car. The process of assigning that value—the associated reward or payoff—is a distinctly human judgement, and one that varies among individuals.

In the past, these prediction and judgement inputs into our decisions were obscured because we often performed both simultaneously in our head. However, the outsourcing of the prediction function to the new tools of artificial intelligence forces us to "examine the anatomy of a decision" and acknowledge the distinction.

Occasionally, an appropriate payoff based on a prediction generated by artificial intelligence can be predetermined and the resulting decision coded into the machine. In these cases, because the prediction dictates the decision, the task itself is prime for automation.
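
A toy expected-value calculation can make the split explicit: the probability comes from the prediction machine, while the payoffs are human judgements. Everything in the sketch below, including the numbers, is invented for illustration.

```python
# Toy decision sketch: the prediction (probability of failure) comes from a model,
# the payoffs come from human judgement, and the chosen action is whichever has
# the higher expected value. All numbers are illustrative.
def expected_value(p_failure, payoff_if_failure, payoff_if_ok):
    return p_failure * payoff_if_failure + (1 - p_failure) * payoff_if_ok

p_fail = 0.15  # machine-generated prediction

# Human-assigned payoffs (arbitrary units) for each action under each outcome.
ev_replace_now = expected_value(p_fail, payoff_if_failure=-10, payoff_if_ok=-10)  # pay the swap cost either way
ev_wait        = expected_value(p_fail, payoff_if_failure=-100, payoff_if_ok=0)   # gamble on the part holding

decision = "replace now" if ev_replace_now > ev_wait else "wait"
print(decision, round(ev_replace_now, 1), round(ev_wait, 1))
```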

But more often, situations are complex and prediction is hard. These are the situations where we are most likely to introduce prediction machines, and the residual uncertainty of the prediction can actually necessitate greater human judgment because the prediction, even if generated through artificial intelligence, may not always be correct. So instead of eliminating the human, artificial intelligence often places an even greater imperative on the human remaining within the system. 

The human position within the system and relationship to the task will likely change. Once the prediction function is automated and assigned to an artificial intelligence, tasks previously deemed essentially human and their associated skills will likely be superseded by new tasks and new skills.

The key policy question for DoD to solve isn't about whether AI will bring benefits but about how those benefits will be distributed. As these tools become more prevalent, individuals will have to learn new skills, and in the process income inequality may be temporarily exacerbated.

Our current and near-future artificial intelligence tools are suspect. Give them a problem and data for which they are trained, and they will perform remarkably; give them a problem for which they are ill-equipped, and they will fail. It doesn’t matter if the tool is designed for business or national defense.

Too often, artificial intelligence is portrayed as magic to be applied to all our most challenging problems. Prediction Machines provides a compelling, fresh perspective to help us understand what artificial intelligence is and its potential impact on our world. The text is essential reading for those grappling to make sense of the field.

Artificial intelligence is simply a prediction machine—it uses information we possess to generate information we do not possess. This simple realisation immediately refocuses  discussions and guides useful development of artificial intelligence. It underscores the situation-specific nature of its data and tools. It discloses its fallibility. 

It also reveals the role of predictions in our decision process, not as determinants but as inputs that must be evaluated according to our uniquely human judgement. This is the "most significant implication of prediction machines": they "increase the value of judgement."

AI applied to difficult tasks such as imagery analysis can extract useful information from raw data and equip leaders with increased situational awareness. AI can generate and help commanders explore new options so that they can select courses of action that best achieve mission outcomes, minimizing risks to deployed forces.

1. Improving situational awareness and decision-making, increasing the safety of operating personnel/equipment, implementing predictive maintenance and supply, and streamlining business processes. 

2. Use AI to predict the failure of critical parts, automate diagnostics, and plan maintenance based on data and equipment condition. Similar technology will be used to guide provisioning of spare parts and optimize inventory levels. These advances will ensure appropriate inventory levels, assist in troubleshooting, and enable more rapidly deployable and adaptable forces at reduced cost.  

3. Streamlining business processes. AI will be used with the objective of reducing the time spent on highly manual, repetitive, and frequent tasks by enabling personnel to supervise automated tasks.

4. AI has the potential to reduce the number and costs of mistakes, increase throughput and agility, and promote the allocation of DoD resources to higher-value activities and emerging mission priorities. 

5. Directive requires realistic and rigorous testing and clear human-machine interfaces, as well as appropriate training for commanders and operators, so weapons function as anticipated in realistic operational environments against adaptive adversaries.

6. Ensure AI systems have a lower risk of accidents and are more resilient, including to hacking and adversarial spoofing.

7. Create new approaches to testing, evaluation, verification, and validation, and increase our focus on defensive network system platforms as a precondition for secure uses of AI.

8. Prioritize fielding of AI systems that augment the capabilities of our personnel by offloading tedious cognitive or physical tasks and introducing new ways of working. 

9. Put in place key building blocks and platforms to scale access to AI, including a common foundation of shared data, reusable tools, frameworks and standards, and network services.

10. In parallel, take steps to ready existing processes for AI application through digitization and smart automation.

Top 10 Artificial Intelligence Advances Readiness Speed With List of Assets Combined to Form Capability

4/20/2020


 
The capability of close air support is not just about having the right attack aircraft, but also the right munitions, pilots, radios, and trained controllers on the ground or in the air. So having a graph of needed assets can offer leaders a more complete, accurate, and flexible picture of what a military can bring to bear than a simple list.

While this approach allows for faster mission analysis, requires less data and computation, and demands less human judgment in the modeling choices, it also has some serious shortcomings. Because it remains divorced from a broader scenario planning context, an approach built on graphs may fail to include time/distance factors or competing demands for capabilities. 

For example, if a mission set requires two C-5 squadrons for mobility, as long as there are two mission-capable squadrons anywhere in the joint force, it will show as ready. It will not take into account if those squadrons can actually make it to where they are needed in time or if they are tied up on other missions. 
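
A minimal sketch of the graph idea follows, using a plain dictionary as the graph; the capability and asset names are invented. Note that it reproduces exactly the shortcoming described above: a required asset counts as available anywhere in the force, with no notion of time, distance, or competing tasking.

```python
# Capability-graph sketch: a capability is "ready" only if all of its required
# supporting assets are ready. Asset names and the graph itself are illustrative.
capability_graph = {
    "close_air_support": ["attack_aircraft", "munitions", "pilots", "radios", "ground_controllers"],
    "strategic_airlift": ["c5_squadron_1", "c5_squadron_2"],
}

asset_status = {
    "attack_aircraft": True, "munitions": True, "pilots": True,
    "radios": True, "ground_controllers": False,
    "c5_squadron_1": True, "c5_squadron_2": True,
}

def capability_ready(capability):
    # The shortcoming described above: this check ignores time/distance factors
    # and competing demands -- a mission-capable asset anywhere counts.
    return all(asset_status.get(asset, False) for asset in capability_graph[capability])

for cap in capability_graph:
    print(cap, "READY" if capability_ready(cap) else "NOT READY")
```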

Another key shortcoming of this approach is the limited influence of the adversary. Enemy capabilities are accounted for in historical mission data, so this method predicts future demands based on past performance, and therefore does not include an agent-based or adaptive red force of the future.

Another approach to the basic questions of readiness takes these limitations into account, albeit at the cost of greater time and resources. This second method creates a fuller, more complex picture, by inputting the “as-is” picture of the current status of all assets into a scenario analysis tool that can model the full set of assigned missions. 

Running the scenario tool then allows for variation and testing of how the current force could execute those missions under different conditions. Rather than relying on historical analysis, varying the scenario can determine if a given set of assets truly can do what the mission asks of them, or if other capability mixes can. 

This approach can answer questions like, "Can the C-5s reach the airfield in time?" or "Can the helicopters assigned to the mission fit the raid force's M327 120mm mortars?" It also allows for multiple scenarios to run against the "as-is" picture of the force concurrently. If separate missions in different theatres overburden the same resources, then they cannot be effectively executed simultaneously, and these would be areas for potential investment.

Even more importantly, this method allows for agent-based simulation to be combined with the breadth of data and variation that AI can provide, creating the most realistic depiction possible of adversary capabilities and courses of action.

Here, the enemy is not simply a static list of capabilities or doctrinal templates; it can react appropriately to the strategy and tactics being used in the simulation. This aspect of scenario-based tools helps military planners to take into account new tactics or new adversaries on which there may not be much historical data. 

For example, how could the Navy possibly know how to counter emerging technologies like hypersonic missiles, or the tactics of a new adversarial group? The answer is to fight against them hundreds if not thousands of times digitally before ever meeting them on the battlefield.

Scenario analysis tools form a large part of meeting the National Defense Strategy Commission’s recommendation that the Department of Defense “must use analytic tools that can measure readiness across this broad range of missions, from low-intensity, gray-zone conflicts to protracted, high-intensity fights.” In short, this more detailed approach can not only help the military be ready for the fight today but also set appropriate force posture to be ready for future fights.

However, this method is also computationally intensive. The greater the accuracy desired from a model, the more and more varied types of data that must go into that model. A full-scale scenario model, for example, would require a near real-time picture of the force, meaning actively sensing and sending data on every operating asset. 

That is an incredible amount of data to manage in one system. Beyond those technical challenges, there are philosophical challenges to overcome as well. Even the most detailed and accurate model is still a model. As such, it is subject to the flaws and biases in human decision-making, as users make determinations about which models, scenarios, and parameters are most likely. Still, AI tools can come together to meet the basic information needs of readiness.

1. Overcoming the challenges

Military decision-makers will face many challenges when implementing AI into their readiness strategies. These challenges can include but are not limited to: who owns the data; how to validate the data; where and how it is stored; the dependence of high-level simulations on lower-level simulations; the classification of data and outputs; and on what network everything should reside. A combination of general AI practices and custom considerations can help military leaders navigate this tangle of choices and chart a path to a fundamentally new readiness system.

2. Asking the right questions

AI is not magic. As we have seen, different types of AI have different strengths and do different things well, but with corresponding limitations. Real-world problems, however, are rarely encapsulated in discrete, neatly defined questions. They are complex topics with many messy, interrelated issues. Therefore, the first challenge is to discover ways to render a general readiness problem into specific questions suitable for AI without losing fidelity or applicability to the real-world problem at hand. This is more a cultural challenge than a technical one. But unless that hard thinking is done up front, any solution generated by AI could be largely irrelevant to the mission problems faced in the real world.

3. The who, what, and where of data

The best starting point when dealing with such significant volumes of data is often going to be the cloud, which allows for a single, extensible repository. Increasingly, cloud providers are also integrating additional AI-enabled services that can speed data validation and other tasks. 

The support available from cloud providers underlines the importance of getting all of the data in the first place. Gathering real-time statuses for every piece of equipment, infrastructure, and service member in the joint force may seem like an impossible task. However, the military may already have much of the data it needs without even knowing it. 

4. Model accuracy

Another common problem for AI adoption is ensuring the accuracy of tools. Even the most advanced AI tools are still tools constructed by humans and, as such, can often mirror the judgments and biases of humans. The Air Force only recently began running predictive maintenance programs on C-5, B-1, and C-130J airframes that had been producing detailed data about aircraft status that went uncollected for years. Identifying and tapping into such existing data sources can jump-start AI-enabled readiness assessments without the need for costly new systems. Previous research has shown that even the adoption of transformational technology can often be accomplished by focusing on the existing data an organization already has, without the need for new capital investments.

5. Uncover/Eliminate Bias issues with training data

One way to help ensure the desired accuracy of an AI system is to use participatory design, a process that includes a wide array of stakeholders, not just programmers and end-users, in the design process. This can help ensure a variety of perspectives are included in a simulation and that the right performance parameters are selected. In military applications, this can be even more important, because every military decision carries with it an implicit understanding of our own tactics and doctrine. 

Since the enemy does not play by the same rules, it is crucial to include a "red team" dedicated to playing devil's advocate in the design process, to avoid AI tools that are unintentionally biased toward our own strategies and therefore predict overly rosy outcomes.

6. Military-specific challenges

Design and data challenges are common to any organization pursuing a large-scale AI project. However, there will also be some challenges unique to the military that will need to be overcome.

7. Model dependencies

A complex scenario analysis tool is composed of several different models at different levels of detail. Higher-level models are dependent on lower-level models for their accuracy. For example, a force-flow model of fighter jets depends upon lower-level, more detailed models about engine performance and fuel consumption at various altitudes. 

If those lower-level models are wrong, they can result in serious inaccuracies in a simulation, with aircraft flying faster than possible or never running out of fuel, or ground units walking for hundreds of miles without getting tired. In short, higher-level models cannot be accurate without getting the details of lower-level models right first.

To obtain the most accurate baseline models possible may require gathering the technical baseline data on key weapons systems. Readiness personnel should work with their acquisition counterparts to gather or gain access to that information for current systems and ensure that future contracts secure access to that information for future systems.

8. Classification management

Perhaps the most closely held military secrets are what a military can and cannot do. So when assessing the readiness of a force against real-world mission sets, naturally, the results are expected to be classified. However, many of the lower-level models may use available data. It is only through aggregating many of these different data points that details of military capabilities and weaknesses are revealed. As a result, classification of such an AI-enabled system needs to be carefully managed to ensure that key vulnerabilities are not accidentally revealed.
This challenge is compounded when considering that the classification of information will determine which communications network the tools must reside on. The higher the classification, the more difficult it will be to get tools certified to operate on that network. As a result, it is likely that an AI-enabled readiness system would exist on multiple networks, from unclassified to different levels of classification. The system will need procedures and tools for moving data from low-to-high and possibly for releasing appropriately classified data from high-to-low without revealing any important information or introducing vulnerabilities to the higher-classification networks.

9. Future Solutions

AI and cognitive tools may not have the history of the tank or the cachet of the aircraft carrier, but they are undoubtedly important parts of future militaries. Understanding the benefits and common challenges of applying AI to military problems such as readiness can not only improve readiness assessments, but can also position the military to use other forms of AI more effectively.

Navigating the general and military-specific challenges is just the first step to AI adoption. Creating a structured campaign plan for AI can help deploy the right AI for the right problems and avoid the digital equivalent of firing ineffective rounds at a target. The adoption of AI is not just like adding another team member: it can fundamentally change how humans and technology work together.

10. Resolve, Remodel, Reimagine

 Determine the key readiness problem sets for AI to address. Identify the data you have related to those problem sets and resolve the issues with it; that is, organize and prepare your data to yield insights.

Change how you structure your data and your organization to make best use of the insights produced by AI. Make sure you have sufficient infrastructure and talent to manage the data and its use within the organization. Remember that outputs from AI systems may need some expert interpretation before decision makers can use them.
Finally, pilot entirely new services and tools that apply AI to even more complex or pressing problem sets. For example, AI is already aiding in real-world scenario planning, helping airports respond to weather events and racing teams anticipate their competitors' every move.



Top 10 Artificial Intelligence Logistics Manoeuvres Enable Resource Capability at Strategic/Tactical Levels

4/20/2020


 
While today we may have an edge in bullets and rockets, any advantage in critical areas such as artificial intelligence and processing of digital information may be quickly eroding. With wars won and lost based on making the right decisions first, AI and information processing tools may be just as critical to victory as ammunition. 

In fact, some senior military leaders think that AI will be more important to great power competition than military power itself. The military needs a strong plan now if it does not want to find itself shooting useless algorithms at its most challenging problems tomorrow.

AI is not magic; adopting AI tools is no guarantee of success. But given the role AI is likely to play in future conflict, not adopting AI will likely guarantee failure. Therefore, learning how to effectively use AI today—with all of its strengths and weaknesses—will be critical to success on future battlefields.

Readiness—a keystone challenge for AI

Assessing readiness informs or draws upon nearly every aspect of military decision-making, from tactical operations to force structure to budgeting. To make readiness assessments and decisions effectively also requires huge volumes of diverse data from many different sources. Large data volumes, diverse sources of information, complex interactions, and the need for speed and accuracy make readiness a problem tailor made for AI to tackle. And if AI can help tackle readiness, it can help the military tackle just about anything.

We have described how redefining readiness can help bring new tools and technologies to bear and provide greater insight than ever before. At its core, this redefining breaks readiness assessments into three smaller tasks: you have to understand what capabilities are required, know the current status of those capabilities, and act to improve those capabilities where needed. Each of these readiness tasks involves sifting through mountains of information, teasing apart complex interactions, and then trying to understand the effects of any decision. That makes them incredibly difficult for human planners to tackle, but perfect for AI.

AI tools can tackle many different aspects of readiness, everything from understanding force requirements to increasing aircraft up-time with predictive maintenance. However, the real power of AI in readiness does not come from discrete point solutions, but from linking many different AI-powered tools together. Then, the smart output of one tool can become the smart input to another.

Putting the AI pieces in play

The term AI may be misleading in one respect. It may lead us to believe that there is just one type of “intelligence” that all AI tools aspire toward. Nothing could be further from the truth. Different AI tools have different purposes, different strengths, and different weaknesses. 

The important insight here is that AI is not a magic bullet to all problems. Until future research breakthroughs create a general purpose and context-aware AI, users must make informed choices about the trade-offs inherent in different AI tools. Perhaps the most basic trade-off is between depth of insight and model complexity, which is at the heart of any discussion of assessing military readiness. Some of the information requirements inherent in assessing readiness are simpler and can be aided by simpler AI tools.

For example, today the requirement to understand the assets required for an assigned mission is often addressed through static planning documents. These are assembled at a strategic or operational level and change infrequently. However, even relatively simple AI can yield more dynamic and potentially more accurate predictions by making use of historical mission data. Historical mission examples and existing plans, such as operation plans (OPLANs) and concept plans (CONPLANs), provide much information on which assets—people, equipment, and infrastructure—have been deployed around the world. Each historical mission also has unique factors, from terrain to adversary capabilities to timeline. Pairing these two types of data in an AI tool such as a neural network can allow users to make predictions about which assets are vital for the success of their particular mission set.
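
As a rough sketch of that pairing (not the Department's actual tooling), the example below maps coarse mission features to the asset mix seen in past missions. A small scikit-learn decision tree stands in for the neural network mentioned above to keep the example short, and every feature, label, and data row is invented.

```python
# Sketch: learn which asset types past missions used, given coarse mission features.
# Features, labels, and rows are invented; a real tool would draw on OPLAN/CONPLAN
# records and historical deployment data.
from sklearn.tree import DecisionTreeClassifier

# Mission features: [terrain_code, adversary_tier, timeline_days]
X = [
    [0, 2, 30],   # desert, peer adversary, 30-day timeline
    [1, 1, 10],   # jungle, regional adversary, 10-day timeline
    [0, 1, 45],
    [2, 2, 14],   # mountain, peer adversary, short timeline
]
# Multi-output labels: [needs_armor, needs_airlift, needs_engineers]
y = [
    [1, 1, 0],
    [0, 1, 1],
    [1, 0, 0],
    [0, 1, 1],
]

model = DecisionTreeClassifier(random_state=0).fit(X, y)
print(model.predict([[2, 2, 12]]))  # predicted asset mix for a new mission profile
```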

However, other aspects of readiness require deeper insights that can only be provided by more complex models. The resources and time required to build and run these complex models mean that they are not well-suited to every situation. As a result, defense leaders seeking to know current force capabilities or how to act to best improve those capabilities face a choice. 

They can either have faster, lighter, but less reliable answers to those questions or more reliable answers but at the cost of time and resources. It’s important to understand that these are the general trade-offs military leaders face in their adoption of AI.

How would you define AI?

The term "artificial intelligence" can mean a huge variety of things depending on the context. To help leaders understand such a wide landscape, it is helpful to distinguish between the model classes of AI, classifications based on how AI works, and the applications of AI, classifications based on what tasks AI is set to do.

Most debate about military artificial intelligence centers on robots, but professionals usually talk logistics. Without fuel, ammunition, spare parts, and maintenance, no weapon, manned or unmanned, is going anywhere.
What’s more, while AI has made great progress in recognising objects/targets and navigating the physical world, autonomous combat robots are far in the future.

A new contract to apply artificial intelligence to Marine Corps maintenance could streamline logistics and help lessen the dependence of fighting forces on long supply lines. Ultimately, AI could enable the far-ranging manoeuvres envisioned by multi-domain operations.

The Marines will apply AI-driven "predictive maintenance" to part of their aging fleet of troop carriers, which are equipped with diesel engines, heavy-duty transmissions, and other components with hundreds of millions of hours of accumulated metrics on diesel engines alone. In the world of AI and machine learning, the more metrics you have, the more accurate your predictions get.

The goal is to track the performance of each major component in real time — oil pressure, turbocharger speed, battery life, etc. etc. — and predict when it’s likely to fail.

Predictive maintenance has two benefits. First, most obviously, it lets you replace or repair a part before it breaks on you. Second, it lets you skip a lot of so-called preventive maintenance, when you pull your vehicle into the shop after so many hours of operation because that’s when, on average, such-and-such a component will need an overhaul.
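
A minimal sketch of the prediction step follows; the sensor names, baselines, and tolerances are invented, and a fielded system would learn failure patterns from the accumulated engine metrics described above rather than rely on fixed thresholds.

```python
# Predictive-maintenance sketch: flag a component when its recent readings drift
# far from a healthy-fleet baseline. Sensor names, baselines, and tolerances are
# illustrative only.
from statistics import mean

baseline = {            # per-sensor (mean, std) assumed learned from fleet history
    "oil_pressure_psi": (55.0, 3.0),
    "turbo_speed_krpm": (95.0, 5.0),
}

def flag_components(recent_readings, sigma=3.0):
    """Return sensors whose recent average drifts more than `sigma` deviations from baseline."""
    flagged = []
    for sensor, values in recent_readings.items():
        mu, sd = baseline[sensor]
        if abs(mean(values) - mu) > sigma * sd:
            flagged.append(sensor)
    return flagged

readings = {
    "oil_pressure_psi": [41.0, 40.5, 39.8],   # trending low -- should be flagged
    "turbo_speed_krpm": [96.0, 94.5, 95.2],
}
print(flag_components(readings))
```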

There has been a small blitz of media coverage of the contract, but it has focused on how predictive maintenance can improve efficiency and cut costs; there are also uniquely military benefits.

Logistics has been a double-edged sword for Marines for generations. On the upside, plentiful supplies of fuel, ammunition, and spare parts have, in good times, kept huge armoured forces on the march. On the downside, long supply lines and iron mountains of supplies constrain how freely a force can manoeuvre once it has arrived.

The Marines could cope with these logistical limits when they had months to build up before the shooting started, nearby bases, and a relatively short distance to drive.

But logistical demands can be much greater when distances are longer and large combat formations are moving along a single axis of advance, to say nothing of the supply convoys and depots behind them.

So emerging concepts called multi-domain operations or distributed operations envision Marines spreading out to make themselves harder targets. Relatively small units would operate "semi-independently," moving frequently from one position to another, without resupply for days at a time.

The problem is Marines are not set up to do this today. Heavy armoured vehicles just require too much fuel and maintenance to operate this way. The long-term solution is to develop lighter and less logistically demanding vehicles, but recent efforts have been less than successful.

In the meantime, the Marines need to figure out how to support the forces they have more efficiently, so those forces can manoeuvre more freely, with less frequent pit stops for maintenance or supply runs for repair parts.

That's where the new contract comes in. A lot of the maintenance that's done today is based on what the owner's manual says: you should get your oil changed and your engine checked every so many miles. That can function as a baseline, but it doesn't take into account how the machine is being used, or the wear, tear, and stresses it experiences.
So we track not only the individual performance of specific components on specific vehicles, but also external variables like weather. Heat, cold, and humidity can all impose stress on machinery.

Where is this information coming from? It turns out that industry's ability to put digital sensors on its products got ahead of its ability to do anything with the data. A lot of machines already have sensors on them that are producing metrics; it's just that nobody is listening.

Another problem arises when a vehicle is in a location with poor bandwidth, or when there is a military reason to turn off all transmissions; the system can stop sending updates for a time. It can also do some of the assessments onboard the vehicle, so it may not have to send raw data back to the central station, minimising bandwidth use and transmission length.

But the big benefit is the ability to pool all available information in one place and then let machine learning figure out patterns, which can then be used to forecast future performance.

We can track general trends across a fleet of vehicles, but the real value is with prediction. Imagine if, instead of having to go to the shop for your scheduled work, you could have your status 24/7.

On the individual machine/equipment level, will the fighter unit make it through the day and do what it needs to do?

Our goal is for tactical commanders to know how many vehicles they have and the overall status of each one, so that better strategic decisions can be made.

Logistics is the 'bridge' that takes resources and applies them on the battlefield. The many activities which occur within this 'bridge', when properly controlled and coordinated, ultimately contribute to the overall 'readiness' of the logistics system to act when it is required.

But many issues arise because the logistics process is not suited to the demands of the real operation when it happens. Some logistics deficiencies could have been directly addressed through improvements in resourcing, but there are many other factors essential to logistics readiness and to the early performance of the logistics process during an operation.

When readiness comes up in meetings, many leaders confuse it with preparedness terms such as 'notice to move'. But it is common to find that, despite a unit being well within its designated 'notice' when the time comes for action, it is constrained by the availability of kit, a lack of enabling elements in supporting formations, slow activation of resources by strategic organisations, and other logistics factors.

In some cases, strategic-level decisions are forced simply because available capabilities cannot be appropriately sustained and so cannot be deployed. No operation is free of friction caused by logistics, but too often the readiness of logistics systems is inadequate, under-resourced, and inefficient.

Fundamentally, logistics readiness is the ability to undertake, to build up and then to sustain, combat operations at the full combat potential of forces. Readiness can comprise actions undertaken during operations, but is predominantly a consequence of routines and practices set in organisational behaviour long before deployment.

It is not a simple matter of issuing logistics units their own 'notice to move' or applying some other metric that will inevitably be 'crashed' through in a time of crisis. Logistics readiness is a function of total organisational performance and efficiency factors that are applicable at all levels – from the strategic to the tactical:

1. There must be a high state of materiel readiness across the force. In addition to appropriately funding the sustainment of equipment and establishing appropriate stockholdings in the right areas to enable operational contingencies, the means of sustaining equipment must be as suited to supporting operations as they are to efficiency in garrison.

2. Failures in materiel readiness in garrison are often replicated in major sustainability issues on operations, and necessitate consequential actions such as cannibalisation to achieve desired operational readiness outcomes.

3. The logistics process, capabilities, and organisations must be systematically assessed for their readiness. Every military activity or exercise is an opportunity for assessing logistics performance, but most military exercises don't comprehensively test and assess operational sustainability and logistics readiness. Fewer still are the exercises that test logistics readiness through a major deployment performed at short notice; a phase of an operation that demands all supporting agencies be ready.

4. There must be timely exchanges of information; one of the perennial challenges in supporting operations is knowing how far to compartment operational information, especially with commercial partners.

5. There must be an appropriate balance of logistics resources to combat elements. This is captured in the idea of the 'tooth-to-tail' ratio.

6. Logistics resources can be apportioned by a variety of means, but the important factor is the total amount of firepower which can be brought to bear. Determine how much effective combat power can be delivered and make smart choices about the ratio of personnel types.

7. Logistics plans and policies must be available, from stockholding policies at the unit and formation level right up to national mobilisation plans at the grand-strategic and economic level. The format and bulk of plans matter less than whether they are developed through interagency effort and reflect an efficient and effective logistics process.

8. Logistics organisations must be structured to support operational requirements rather than bureaucratic needs, even though they may not need to be resourced to their full wartime capability during most periods.

9. Organisational architecture must be established to enable the transition to an operational footing, and policies must be in place to enable such a transition to occur rapidly.

10. There must be a mutual understanding between commanders and the logistics units, agencies, and organisations that support them, founded on clear communication of commander's intent as well as on the culture of cooperation within the force or formation.



Top 10 Artificial Intelligence Teams Set of Established Process/Tools Facilitate Capability Delivery

4/10/2020


 
Targets emerge in seconds, incoming enemy fire puts lives at risk, and shifting combat dynamics require immediate, on-the-spot decisions -- all as soldiers navigate the complex web of threats during all-out, high-risk ground warfare.

These kinds of predicaments, which characterize much of what soldiers train to face, are immeasurably improved by emerging applications of AI; artificial intelligence can already gather, fuse, organize and analyze otherwise disparate pools of combat-sensitive data for individual soldiers. 

Target information from night vision sensors, weapons sights, navigational devices and enemy fire detection systems can increasingly be gathered and organized for individual human soldier decision-makers.

However, what comes after this? Where will AI go next in terms of changing modern warfare for Army infantry on the move? Teams are exploring a "next level" of AI. Fundamentally, this means not only using advanced algorithms to ease the cognitive burden for individual soldiers, but also networking and integrating otherwise stovepiped AI systems. In effect, this could be described as performing AI-enabled analytics on groups of AI systems themselves.

"Autonomy is doing things in a snipped way that can be connected. We can benefit from an overarching AI approach, something that looks at the entire mission. Right now our autonomy solves very discrete problems that are getting more complicated."

What does this mean? In essence, it translates into a way combat commanders will not only receive AI-generated input from individual soldiers but also be able to assess how different AI systems can themselves be compared to one another and analyzed as a dynamic group. 

For instance, multiple soldier-centric AI-empowered assessments can be collected and analyzed in relation to one another with a mind to how they impact a broader, squad-level combat dynamic. In particular, simultaneous analysis of multiple soldier-oriented AI system can help determine a best course of action for an entire unit, in relation to an overall mission objective.

"What is the entire mission and the possible courses of action? Do we optimize the logistics flow? Find targets as the dynamic battlefield gets more complex?" The commander can draw upon advanced AI to explore new options.

So in addition to drawing upon algorithms able to organize data within a given individual system, future AI will encompass using real-time analytics to assess multiple systems simultaneously and how they impact one another to offer an overall integrated view. All of this progress, just as is the case now, will still rely heavily upon human decision-making faculties to optimize its added value for combat. Integrating a collective picture drawing upon a greater range of variables will require soldiers to incorporate new tactics and methods of analysis to best leverage the additional available information.

"When we have new and improved autonomy coming in, soldiers need to know how to use that. How do you keep the soldier always at the center and adapt to them as you adapt to the new AI?"

One soldier could receive organized sensor-driven targeting data relevant to a specific swath of terrain, while another AI system is organizing variables to determine the supply flow of ammunition, fuel or other logistical factors.

“Data never seen cannot be learned. It is not about AI, but combining AI with a soldier who has the concept of an entire mission. AI provides information and then they get put together. When you are under fire, you are going to need different types of information.”

For example, comparing and analyzing various AI systems to get a collective picture of some kind might enable a commander to know ….“If you go this way you will use more fuel but it will be safer.”

While Machine-Learning techniques continue to accelerate the pace at which an existing AI database can quickly integrate and perform analytics on new information, AI-infused computing can only make decisions or solve problems in relation to the information it already has stored. Now it goes without saying that these databases are increasingly vast, almost seeming limitless, yet they do need to consistently be fed with not-yet-stored information of great relevance to wartime decisions.

Every autonomous system that interacts in a dynamic environment must construct a world model and continually update that model. This means that the world must be  sensed through cameras, microphones and/or tactile sensors and then reconstructed in such a way that the computer ‘brain’ has an effective and updated model of the world it is in before it can make decisions. The fidelity of the world model and the timeliness of its updates are the keys to an effective autonomous system.
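
In code, the sense-and-update cycle described above might look like the loop sketched below; the sensor feed and the world-model contents are placeholders rather than any real robotic stack.

```python
# World-model update loop sketch: each cycle, sense the environment, update the
# model, then decide. The sensor sweep and model contents are placeholders.
import time

class WorldModel:
    def __init__(self):
        self.objects = {}          # object_id -> last observed position
        self.last_update = None

    def update(self, detections, timestamp):
        for object_id, position in detections.items():
            self.objects[object_id] = position
        self.last_update = timestamp

def fake_sensor_sweep(step):
    # Stand-in for camera/microphone/tactile input processing.
    return {"obstacle_1": (step * 0.5, 2.0)}

model = WorldModel()
for step in range(3):
    detections = fake_sensor_sweep(step)
    model.update(detections, timestamp=time.time())
    # Decision-making would read from `model` here; the timeliness of
    # `last_update` is what the passage above identifies as the key.
    print(step, model.objects)
```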

"AI & Robots Crush Foes In Army Wargame with simulated infantry platoon, reinforced with drones and ground robots." How big a difference does it make when you reinforce foot troops with drones and ground robots? You get about a 10-fold increase in combat power, according to a recent Army wargame.

"Their capabilities were awesome," according to the captain who commanded a robot-reinforced platoon in nearly a dozen computer-simulated battles.

That mission: dislodge a defending company of infantry, about 120 soldiers, with a single platoon of just 40 attackers on foot. That’s a task that would normally be assigned to a battalion of over 600. In other words, instead of the minimum 3:1 superiority in numbers that military tradition requires for a successful attack the simulated force was outnumbered 1:3.

When they ran the scenario without futuristic technologies, using the infantry platoon as it exists today, "that did not go well for us."

But that was just the warm-up, familiarising the captain and his four human subordinates – three lieutenants and a staff sergeant, each commanding a simulated squad – with a complex physics-based model so fine-grained it can assess whether an individual simulated soldier is compromised in any given attack. The amount of information each player gets is limited: they only know what their simulated soldiers on the battlefield could know, so the game replicates real-world conditions.

Then the wargame organizers added dozens of unmanned systems to the simulation. The immediate impact was on what the team could see. Instead of being limited to the immediate field of view of their simulated soldiers, they could send the drones ahead to scout. Instead of being able to engage the enemy about 500 meters away, they could spot and attack them from 5,000 meters.

“It was awesome to be able to increase that zone of where we knew exactly what was going on, without being right on top of the enemy. We were able to pretty much control the amount of area that probably a battalion-minus would have been able to control, with just one platoon.”

That doesn’t mean it was easy to adapt to the new tools. “The first time we used them was definitely a learning curve.”  Drones can move much faster than ground robots, but they can’t carry as much firepower as a ground vehicle of similar size and cost. So, at first the fliers rushed ahead, found the enemy position, and then had to wait for the ground units to catch up. 

Meanwhile the opposing players, controlling the enemy force, noticed the drones and, although they weren’t able to shoot them down, they could use the time to ready their defenses. The manned-unmanned team still won, but not as decisively as they wanted to.

“Our UAS [Unmanned Aerial Systems] were able to identify exactly where enemy were, but we were unable to eliminate them without our ground vehicles. You have to figure out how you’re going to mass combat power,” rather than attack piecemeal.

"As we did more and more iterations, we were able to build in more control measures and have more of … a human in the loop." After about the second or third run with all the advanced systems, the human players were able to coordinate the air and ground robots in a single synchronized assault.

Coordinating these high-tech combined arms – aerial drones, unmanned ground vehicles, and human foot soldiers – was a lot more complex than leading an ordinary infantry platoon. While young troops who grew up on video games know how to use computer control interfaces, they may not have the tactical experience required.

What’s next? The Army wants to build actual prototypes of select technologies for a series of real-world field tests and experiments in 2020. Army will try out the individual prototypes, then integrate them together into a series of increasingly complex experiments, culminating in a full “system of systems” field exercise.

Increasing the range of the platoon's technologies 10-fold – from 500 meters to 5,000 – increases the area it has to control roughly a hundred-fold. The key technology was a platoon artificial intelligence cloud, the architecture that allowed the soldiers to control robotic systems that were extending their reach within that battlespace.

Modern drones can fly themselves from point to point. The human just has to set the destination. Even ground robots, which have to deal with rocks, trees, mud, and more, are increasingly capable of detecting obstacles – which requires a lot of AI brainpower to interpret sensor data – and finding their way around them.

So the simulation assumed the robots could find their own way to an objective without a human remotely dictating every twist and turn along the way.

"We were not flying these things, we were not telling them how to drive. We're saying, 'This 100 by 500 meter area – you have to go here,' and they would figure out how to do it."

"Soldiers are not controlling these systems. They are commanding the AI cloud to control these systems."
Now, just getting places is not enough. Probably the most complex and critical task of the artificial intelligence – both the individual AI on each unmanned vehicle and the overarching AI in the platoon cloud – is to pull together sensor data, digest it, and condense all the millions of 1s and 0s into a single picture of the tactical situation that a human commander can understand.
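
As a rough sketch of that condensation step (the detections and message format are invented), the example below merges reports from several unmanned systems into one short tactical summary; as the next paragraphs note, a human "white cell" played this role in the wargame.

```python
# Sketch of "condense sensor data into one picture": merge detections from many
# unmanned systems into a short tactical summary. All fields are illustrative.
from collections import Counter

detections = [
    {"source": "uav_1", "type": "infantry", "grid": "NK1234"},
    {"source": "uav_2", "type": "infantry", "grid": "NK1234"},
    {"source": "ugv_1", "type": "vehicle",  "grid": "NK1290"},
]

def summarize(detections):
    by_location = Counter((d["grid"], d["type"]) for d in detections)
    lines = [f"{count}x {kind} reported near {grid}"
             for (grid, kind), count in sorted(by_location.items())]
    return "; ".join(lines)

print(summarize(detections))   # one human-readable picture from many feeds
```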

"The thing that made this work was that platoon AI cloud that gave you the situational awareness of what was going on with that huge area." That level of artificial intelligence doesn't exist – at least not yet. To simulate its effects, the wargame relied heavily on human beings: a neutral "white cell" that took the sensor data, interpreted it, and summed it up in text messages to the team.

In the tech world, it’s called a “mechanical Turk”: human labor pretending to be automation. Developing an AI that can synthesize data this way in the real world is a major effort across the armed services, part of the wider push for what’s called Joint All-Domain Command & Control.

The human-driven process used in the wargame was intended to help the Army think about how to use such an AI without waiting for someone to build it first. “It’s a surrogate. It’s a model of the real world that has met all of the requirements to be accepted for analysis.”

The next step is building the real thing – and testing that it really works.

"Pentagon Proposes Automation to be Seamless as in Strategy Game"

With the click of a switch, unknown tanks and infantry become clear, as our tank-commanding avatar holds a tablet with the adversary positions illuminated in red. Finding these adversaries is an array of systems, from satellites to drone swarms to uncrewed reconnaissance vehicles on the ground.

Another click, and the hostile forces on the screen are replaced by scorch marks, the tank commander’s tablet illuminated with the range of strikes called in from air and land forces.

While it exists in simulations and in games, perfect information on a battlefield remains an impossibility. Creating a “red force tracker,” that is, an intelligence collection process that provides real-time information on where enemies are at all times, is a stretch for current technology. But it is one that could get closer to reality with autonomous robots scouting and providing information. This would take a great degree of information integration and distillation at the point of collection to work.

Rather than remote-controlled or teleoperated machines, future machines could be autonomous enough to require little human supervision, employ complex tactics, and allow for a high degree of coordination with little need for communication.

“If we want to reduce load on soldiers, we have to get the equivalent of Siri for robots. We have to get the same interaction from a human-computer interface that a tank commander has with its driver, where it can maneuver in that space.”

Consider the example of the tablet-commanded robot scouts and called-in strikes. This is a vision of military command where a human sits at the center of an autonomous body of sensors, perhaps gives them objectives but not specific targets, and then lets the machines process information to convert objects recorded with cameras into coordinates for where airplanes and artillery should place explosives. It’s a vision of war almost as seamless as a round of the real-time strategy game Command & Conquer.

1. Challenges facing military decisions clearly involve interdependencies and uncertainties

2. Assessing complex domains necessitates critical judgments to increase the probability of desirable outcomes

3. Quantifying big data alone to make decisions is often inadequate

4. Effective analyses require qualitative information to uncover insights into the behavioural domain

5. Select metrics based on their ability to represent battlefield conditions

6. Establish clear standards of spatial/temporal data reliability

7. Account for the impact of behavioural factors such as discipline

8. Use valid assumptions and establish category testing criteria

9. Consider the probability that the analysis was wrong

10. Identify risks to operations if the analysis was wrong

1 Comment

Top 10 Artificial Intelligence Challenges Decisions Include Not Valued Data Remains Misplaced/Obscured

4/10/2020

0 Comments

 
​Sound and effective decisions, supported by reliable data, usually determine military operational success. Recent rapid advances in electronic instrumentation, equipment sensors, digital storage, and communication systems have generated large amounts of data.

This proliferation of digitized information provides military leaders innumerable data mining opportunities to extract hidden patterns in a wide field of situations.

From the complex information contained in this data, visualization tools and other data science methods help leaders, especially commanders and their staffs, ask questions, develop solutions, and make decisions.

Today’s military has vast amounts of data, but very little of it is truly AI-ready. “In legacy systems we’re essentially playing the data as it lies, which gets complicated, because it’s messy, it’s dirty. You have certain challenges of data quality, data provenance, and data fidelity, and every one of those throws a curve ball.”

While the Pentagon needs solid data for lots of different purposes, not just AI, large amounts of good data are especially essential for machine learning. Fighting wars is only going to get more complex in the future.

Joint All-Domain Command & Control: This is a pilot project working towards what’s also called Multi-Domain C2, a vision of plugging all services, across all warfighting domains into a single seamless network. It’s a tremendous task to connect all the different and often incompatible technologies and organizations.

Autonomous Ground Reconnaissance & Surveillance: This involves adding analysis algorithms to more kinds of scout drones and even ground robots, so the tools can call commanders attention to potential threats and targets without someone having to watch every frame of video.

Operations Center Assistant: This project aims to streamline the flow of information through the force. It will start with using natural-language processing to sort through radio chatter, turning troops’ urgent verbal calls for airstrikes and artillery support into target data in seconds instead of minutes.
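As a rough illustration of that idea, the toy parser below pulls a structured target record out of a free-form call-for-fire transcript. The transcript, field names, and regular expression are invented for this sketch and are not the project’s actual format:

```python
import re

# Toy transcript of a verbal call for fire (format and fields are illustrative only).
transcript = "Fire mission, grid 18S UJ 23480 06470, BMP-2 in the open, danger close, over"

PATTERN = re.compile(
    r"fire mission.*?grid (?P<grid>[0-9A-Z ]+?),\s*(?P<target>[^,]+)",
    re.IGNORECASE,
)

def extract_target(text):
    """Pull a structured target record out of free-form radio chatter."""
    m = PATTERN.search(text)
    if not m:
        return None
    return {
        "grid": m.group("grid").strip(),
        "description": m.group("target").strip(),
        "danger_close": "danger close" in text.lower(),
    }

print(extract_target(transcript))
# {'grid': '18S UJ 23480 06470', 'description': 'BMP-2 in the open', 'danger_close': True}
```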

Sensor To Shooter: This will develop algorithms that can shrink the time to locate potential targets, prioritize them, and present them to a commander, who will decide what action to take. This is about making troops faster, more efficient, and more effective. Troops are still going to have to make the big decisions about weapons employment.”
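A hedged sketch of what “prioritize and present” could look like: score hypothetical candidate targets on threat, confidence, and proximity, then rank them for the commander, who still makes the call. The weights and fields are assumptions, not doctrine:

```python
# Hypothetical candidate targets produced by upstream sensors; fields are assumptions.
candidates = [
    {"id": "tgt-01", "type": "SAM radar",    "threat": 0.9, "confidence": 0.7, "range_km": 42},
    {"id": "tgt-02", "type": "supply truck", "threat": 0.3, "confidence": 0.9, "range_km": 12},
    {"id": "tgt-03", "type": "tank platoon", "threat": 0.7, "confidence": 0.6, "range_km": 25},
]

def priority(t, max_range_km=60):
    """Blend threat, sensor confidence, and proximity into a single score."""
    proximity = 1 - min(t["range_km"], max_range_km) / max_range_km
    return 0.5 * t["threat"] + 0.3 * t["confidence"] + 0.2 * proximity

# Rank and hand the ordered list to the commander -- the human still decides.
for tgt in sorted(candidates, key=priority, reverse=True):
    print(f"{tgt['id']:>6}  {tgt['type']:<13} score={priority(tgt):.2f}")
```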

Dynamic & Deliberate Targeting: The idea here is to take targets (for example, ones found by the Sensor To Shooter tool) and figure out which aircraft is best positioned to strike them, with which weapons, along which flight path, much as a ride service matches a driver with a route.

“The data’s there in all these cases, but what’s the quality? Who’s the owner of the data? There’s a lot of data that exists in weapons systems” – from maintenance diagnostics to targeting data – “and unlocking that becomes harder than anybody expected.”

Military leaders see huge opportunities to use AI to comb through that complexity to make operations more efficient, reduce collateral damage, and protect the troops.

Military leaders have demanded visualized access to massive amounts of data to enhance their decision making, which quickly morphed into an ambitious project called the Leader Dashboard.

Initial efforts surfaced nearly a thousand unique data sources, such as training databases and equipment inventories. This proliferation of data appeared limitless and offered staggering potential to enhance decision making with valuable real-time information.

Applications crossed multiple functions throughout military organizations, such as logistics and risk assessments. But project developments soon revealed that its data existed in different systems that didn’t talk to each other, so good data was difficult to obtain and likely unreliable.

The military world is overflowing with data, which requires analyses brimming with subjective interpretations. Because many analyses support pre-existing beliefs shaped by cognitive biases, military leaders should designate competent data antagonists.

Leaders should use analysts they trust to uncover compelling evidence to challenge analyses and become savvy with data science and other analytic tools themselves.

Leaders should treat pertinent data that is both high-quality and reliable as strategic assets and force multipliers, and should strive to remain open-minded and adopt adaptive leadership practices and methodologies.

Leaders must address the full spectrum of threats and rapidly change tactics in response to new scenarios with the available techniques and procedures. The key is an approach that commits to decisions but does not become permanently linked to them.

Data scientists first conduct exploratory analyses to search data for trends, correlations, and relationships between measurements. Then they use descriptive analytics to understand operational aspects of the data, such as summarizing it with basic statistics to calculate the combat power of operational units.
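As a small illustration of the descriptive step, the sketch below summarizes invented unit readiness measurements with basic statistics; the unit names and figures are placeholders:

```python
from statistics import mean, stdev

# Illustrative readiness measurements per unit; names and values are invented.
reports = {
    "1st Battalion": [0.82, 0.79, 0.85, 0.88],
    "2nd Battalion": [0.64, 0.71, 0.58, 0.69],
}

for unit, scores in reports.items():
    # Basic summary statistics: the descriptive-analytics step before any modeling.
    print(f"{unit}: mean readiness {mean(scores):.2f}, "
          f"spread {stdev(scores):.2f}, latest {scores[-1]:.2f}")
```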

Using complex statistical techniques together with machine learning and probability theory, predictive analytics uncover relationships between data inputs and outcomes.

Data analytics enhance capability by using more data points instantaneously to transform asymmetries of data into useful information. Overcoming behavioural limitations and biases, data analytics allow military leaders to make quicker decisions with more valid, dependable, and transparent information, providing valuable input into military decisions.

Analysts exploit tools to unlock secrets hidden in a huge cache of information, now measured and stored in digital bits. As an integration of talent, tools, and techniques, data science is really an art of transforming data into the actionable information needed for decisions. To obtain data and report information, the emerging technology capitalizes on data cleaning, data monitoring, reporting, and visualization processes.

Previously, data collection and analysis were expensive, requiring paper-based reports. Now, with advancements in sensor and satellite technologies, military leaders can obtain real-time access to data remotely on almost any topic, providing clearer pictures of situations that can be swiftly adjusted to updated circumstances and tailored to specific missions.

Applying more complex techniques built on modeling and simulation, data scientists use prescriptive analytics to determine probabilities of potential outcomes based upon deliberate changes to inputs.
Where available, qualitative techniques such as decision theory and wargaming further improve understanding of the data, reduce technical barriers, and make it easier to provide relevant information to users in dispersed locations.
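One way to picture prescriptive analytics is a toy Monte Carlo run that estimates how an outcome probability shifts as an input is deliberately changed. The failure rates below are invented for illustration only, not planning factors:

```python
import random

random.seed(1)

def simulate_mission(extra_tankers, trials=10_000):
    """Toy Monte Carlo: estimate mission completion probability for a given input change."""
    successes = 0
    for _ in range(trials):
        fuel_ok = random.random() < 0.90 + 0.03 * extra_tankers   # more tankers, fewer fuel aborts
        weather_ok = random.random() < 0.85                        # unaffected by the decision
        maintenance_ok = random.random() < 0.92
        successes += fuel_ok and weather_ok and maintenance_ok
    return successes / trials

# Compare deliberate changes to inputs and pick the option with the better projected outcome.
for tankers in (0, 1, 2):
    print(f"extra tankers={tankers}: P(mission complete) ~ {simulate_mission(tankers):.3f}")
```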

Computational support systems such as data warehouses, data mining, virtual teams, and knowledge/optimisation and management systems augment these qualitative techniques.

Advances in data analytics have impacted military operations.  Using updated networks with advanced number-crunching tools, analysts have identified significant bottom-line improvements impacting decision making capabilities for all of its activities.

While there will be “minimum common denominator” standards for tagging metadata with various categories and labels, “you will have lots of flexibility for mission-specific tagging.” It’s a tremendous task, but one with equally tremendous potential benefits. “We are trying to fix all sorts of problems with data across the Department of Defense,” not just for AI.

“AI will likely become the driving force of change in how the department treats data. Technology is changing so fast that the painful data processes we endure today may well be transformed soon into something entirely more user-friendly.”

But many challenges will remain well into the future. There’s no simple silver bullet solution. Some suggest rigorously imposing some kind of top-down standard for formatting and handling data, but the Defense Department has too many standards already, and they are inconsistently applied.

“There are a lot of people who want to just jump to data standards. But every weapons system that we have, and every piece of data that we have, conforms to some standard. There are over a thousand different standards related to data today. They’re just not all enforced.”

“It’s less a question of standards and more of policies and governance. We now have to think about data as a strategic asset in its own right. Now, a much better approach to drive interoperability is to start with a discussion of metadata standards that are as lightweight as possible, as well as a Modular Open Systems Architecture. Or put another way, we need to agree on the definition of ‘AI Ready’ when it comes to our weapon systems.”
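As a sketch of what a lightweight, minimum-common-denominator metadata record and an “AI Ready” check might look like, the field names below are illustrative assumptions, not an official DoD standard:

```python
from dataclasses import dataclass, field

# A minimal-common-denominator metadata record, sketched as a dataclass.
# Field names are illustrative; they are not an official DoD metadata standard.
@dataclass
class DataAssetMetadata:
    asset_id: str
    owner: str                 # who is accountable for the data
    classification: str        # handling caveat
    provenance: str            # originating system or sensor
    collected_utc: str         # ISO-8601 timestamp
    tags: dict = field(default_factory=dict)   # flexible, mission-specific tagging

REQUIRED = ("asset_id", "owner", "classification", "provenance", "collected_utc")

def is_ai_ready(meta: DataAssetMetadata) -> bool:
    """One possible working definition of 'AI ready': every core field is populated."""
    return all(getattr(meta, f) for f in REQUIRED)

record = DataAssetMetadata(
    asset_id="bradley-042-vibration",
    owner="PEO Ground Combat Systems",
    classification="CUI",
    provenance="hull vibration sensor",
    collected_utc="2020-03-14T09:30:00Z",
    tags={"mission": "predictive-maintenance", "vehicle": "M2A3"},
)
print(is_ai_ready(record))   # True
```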
The problem with Pentagon AI projects is that Leaders don’t know what they want.

Imagine you want a car with a sunroof. Do you buy a car that has a sunroof, or do you buy a car that doesn’t have one and pay a mechanic to take a blowtorch to it?

You really don’t want to reinvent any wheels by integrating new tools you don’t have to. All too often, “leaders believe they can just hand their product over, it goes into network infrastructure, and it’s magic and it works.” In reality, it takes a lot of time just to install the product. 

Installation and integration aren’t the only problems. Leaders want to incorporate AI so it can pool data previously scattered across the bureaucracy, making it easier to analyze.


A new workforce will have the ability to uncover compelling evidence to challenge analyses. Finally, leaders should use analysts they trust or, better yet, become savvy with data science and other analytic tools themselves. AI relies on the fact that, in any large set of data, clusters of data points emerge that correspond to things in the real world.

“Difficulties Using AI to Compare Different Information Domains”

Certain kinds of information collection systems might be unique to individual data sets. Resolving potential differences between what the essay calls “multiple interactions” might prove difficult.

Future AI will need to “analyze the integration challenges of different AI approaches—the requirements for delivering reliable outcomes from a range of disparate components reflecting the conventions of different information domains.”

All this being said, the current and anticipated impact of fast-progressing AI continues to be revolutionary in many ways; it goes without saying that it is massively changing the combat landscape, bringing unprecedented and previously unknown advantages.

“Tech Infrastructure Adapts to Face New Threats”

AI is progressing quickly when it comes to consolidating and organizing data from otherwise separate sensors on larger platforms, such as an F-35 or future armored vehicle…. yet integrating some of these same technical elements has not reached dismounted infantry to the same extent.

For example, emerging algorithms can quickly distinguish between someone extending a weapon and someone merely digging a hole -- or recognize enemy armored vehicles. The AI-empowered system could also quickly cue a combat analyst so they don’t spend time poring over massive amounts of data.

The concept here is not so much the specific systems as it is a need to employ solid engineering. “In some cases the better chance of victory will be due to faster adaptability. Creating intelligent systems that are able to self-adapt to Soldiers' needs and seamlessly adjust as Soldiers adapt to the changing situation promotes rapid co-evolution between Soldiers and autonomy.”

Teams are engineering the standards through which to create interfaces between nodes on a soldier or between groups of soldiers. For instance, some of these nodes could include laser designators, input from radio waves or data coming in from satellite imagery overhead. 

Computers are so much faster, and algorithms are now being advanced to “train at scale” to analyze a series of images and pinpoint vital moments of relevance.

“We build out algorithms we could run on some kind of soldier-worn system such as a small form factor computer, thermal imaging, daytime cameras or other data coming in quickly through satellites. When you network all of this together and bring in all the sensor data, machine learning can help give soldiers accurate prompts. The more we do this, the smarter the algorithms get.”

“Decisions Based on Intelligence at the Speed of Conflict”

Accessing information on the battlefield presents challenges. Every second matters during a critical mission, and you don’t want to waste time sorting through millions of documents looking for the right intelligence.
 
AI can help create a common operating picture for more reliable situational understanding. That means the system only pulls the most important information for your current situation, giving you accurate, legible intelligence faster. 

To fully understand the situation, military leaders had to maintain a good grasp of the analyses used, else they could become prone to making poor decisions. For example, there could be incorrect coding, including data reporting bias, giving a false sense of reliability that led to unwarranted decisions. 

Before the onset of data analytics to uncover information, military leaders used outdated guidelines to reduce complexity and to fill knowledge gaps, often based upon historical outcomes.  Additionally compounding decision making today is the abundant quantity of data that masquerades as valuable information while more pertinent data remains misplaced and obscured. 

Challenges facing military decisions clearly involve interdependencies, uncertainties, and complexities, such that military leaders need critical thinking to increase the probability of desirable outcomes.

Since the warfighting enterprise is a complex domain of human beings and their various personalities, quantifying big data alone to make decisions is often inadequate.

Instead, effective analyses require qualitative information to uncover insights into the human domain of warfare. To address these challenges, military leaders should demand the following from their analysts:

1. Treat pertinent, quality, reliable data as force-multiplier assets

2. Adopt adaptive practices to address rapidly evolving threats

3. Commit to decisions, but do not become permanently linked to them

4. Recognize that connectivity issues could leave analysts in the dark

5. Recognize that even when the connection is clear, there is too much available information

6. Explain why the selected metrics were chosen over the many other metrics

7. Verify the reliability of the data used, including human factors

8. Test risk-factor assumptions before data is used

9. Calculate the probability that the analysis was wrong

10. Determine the potential risks if the analysis was wrong



0 Comments

Top 10 Artificial Intelligence Network Task Execution Tools Data Moves Faster in Tactical Mission Space

4/10/2020

0 Comments

 

​Each individual drone and ground robot needs its own narrow AI to navigate over terrain, analyze data from its sensors, and communicate with the rest of the force. But the most important AI is an overarching artificial intelligence to coordinate the whole platoon – an AI that doesn’t reside in any one physical location, but exists in a wireless cloud.

Instead of a single, central supercomputer that could be blown up, hacked, or have its communications jammed, the coordinating intelligence is distributed across multiple mini-servers carried by robotic vehicles and, potentially, individual soldiers. If one server is destroyed or loses communications, there are still others on the platoon network.
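A minimal sketch of that failover idea, assuming hypothetical server and robot names: if one mini-server stops reporting heartbeats, its robots are redistributed across the survivors:

```python
import time

# Hypothetical mini-servers in the platoon cloud and the robots each one coordinates.
servers = {
    "ugv-server-1":  {"last_heartbeat": time.time(), "robots": ["uav-1", "uav-2"]},
    "ugv-server-2":  {"last_heartbeat": time.time(), "robots": ["ugv-3"]},
    "soldier-kit-7": {"last_heartbeat": time.time(), "robots": ["uav-4"]},
}

HEARTBEAT_TIMEOUT_S = 10  # assumed liveness threshold

def rebalance(servers, now=None):
    """Reassign robots from any server that has stopped reporting to the survivors."""
    now = now or time.time()
    alive = {k: v for k, v in servers.items() if now - v["last_heartbeat"] < HEARTBEAT_TIMEOUT_S}
    if not alive:
        return alive
    dead = [k for k in servers if k not in alive]
    orphans = [r for k in dead for r in servers[k]["robots"]]
    for i, robot in enumerate(orphans):
        # Round-robin the orphaned robots across the surviving servers.
        target = list(alive)[i % len(alive)]
        alive[target]["robots"].append(robot)
    return alive

# Simulate one server dropping off the network (e.g., jammed or destroyed).
servers["ugv-server-2"]["last_heartbeat"] -= 60
for name, info in rebalance(servers).items():
    print(name, "now controls", info["robots"])
```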

Battlefield networks have to overcome problems no commercial system faces, such as an extensive arsenal of electronic warfare systems to detect and jam transmissions. Army decided its tactical network was far too vulnerable to hacking and jamming, so it rebooted the entire modernization effort.

One factor that makes this easier is the limited range involved. A five-kilometer radius from the platoon commander is a long way for traditional infantry operations, but it’s pretty short compared to many military communications systems. What’s more, with 40 soldiers and about as many unmanned systems spread throughout that area, weak signals can be relayed from radio to radio to radio, crossing long distances in several shorter hops.

Another crucial factor is limiting the bandwidth required. First-generation drones like Predator require human operators remote-controlling everything they do. Basically, they still have a human crew, the crew’s just not inside the vehicle, so that requires an uninterrupted full-motion video feed from the drone to the operators to see what they’re doing, and an uninterrupted stream of moment-to-moment commands from the operators to the drone.

As part of its tactical network modernization strategy, the Army is reducing complexity for users at the tactical edge and arming them with the network capabilities they need to defeat increasingly advanced adversaries.

To accomplish certain missions in today’s fight, commanders may need to reassign certain units, such as moving a company to a different battalion. But such a move requires signal Soldiers to re-provision the unit’s vast number of network systems with new data and software, including new applications, firewall configurations and initialization data products. These products are assigned to each unit before deployment or training events, to enable the systems to run on the network. 

When a unit is reassigned, new data products are needed to support the new assignment. These products include unique identifiers, roles and Internet Protocol addresses, taking into account a unit’s specific mission, personnel footprint and mix of networked mission command systems. The Army refers to this process as unit task reorganization.
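A toy sketch of the identifier/role/IP portion of that re-provisioning step, with invented unit, system, and subnet values; real initialization data products carry far more than this (crypto, firewall rules, application configurations):

```python
import ipaddress
from itertools import islice

def reprovision(unit_name, gaining_battalion, systems, subnet):
    """Issue new identifiers, roles, and IP addresses for a reassigned unit's systems."""
    hosts = islice(ipaddress.ip_network(subnet).hosts(), len(systems))
    products = []
    for index, (system, ip) in enumerate(zip(systems, hosts), start=1):
        products.append({
            "unit": unit_name,
            "parent": gaining_battalion,
            "system": system,
            "unique_id": f"{gaining_battalion}-{unit_name}-{index:03d}",
            "role": "mission-command-node",
            "ip_address": str(ip),
        })
    return products

# Example: a company moves to a different battalion and gets fresh data products.
for p in reprovision("A-Co", "2-7-IN", ["JBC-P", "CPCE", "TSM radio"], "10.20.30.0/28"):
    print(p["unique_id"], p["ip_address"], p["system"])
```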

In the face of potential peer and near-peer threats, the Army needs dynamic and flexible network re-provisioning capabilities to reflect changes in mission and assigned units. In the past, signal Soldiers manually conducted the provisioning and re-provisioning process one device at a time, with physical cables connecting each node to the network, which took many weeks, depending on the equipment and size of the unit. 

More recently, new Army capabilities are enabling over-the-air provisioning and security patching, which could, for example, speed the time it takes to provision a brigade’s worth of on-the-move, network-equipped vehicles without having to take the entire system offline in the process. The implementation of an Army software-defined networking design could speed that process even further.

The Army is also looking to leverage software-defined networking to increase security in the tactical network by enabling rapid response  patching and configurations in support of offensive and defensive network operations. 
“Data and Defense: How to Boost Readiness”

Defense and intelligence agencies want to leverage the data they collect so they can use artificial intelligence to enhance readiness – but most who run these programs don’t know how to get started and find it difficult to make the business case to their leadership. So what do we mean by readiness?

It’s the ability to execute on missions critical to national security and keep Troops safe. Current readiness enables units to do two things – execute core functions and perform assigned missions. Measuring a variety of data, including various personnel, equipment, equipment serviceability and training statuses, is important to ensuring this readiness.

Based on  incoming operational data, decision-makers want to know how best to respond based on a globally integrated understanding of both demand and capacity, and the impact on wartime readiness.

To develop options, redirect mobility efforts based on changing priorities, or find alternatives when operations are disrupted, component commands depend on analysis that transforms data into useful information.
The integration of analytics into command processes and decisions captures and integrates data; creates visual representations of key indicators; forecasts workload, network, or asset availability; and accounts for constraints so that energy is focused on the most important work.

“Above all else,  leaders must be able to deliver insightful products and recommendations, with unwavering confidence in our credibility. The data is absolutely necessary, but we’re in the business of delivering analytic products and enabling a broader community of analytic practitioners to transform the data into actionable information on complex issues and address challenging problems.”

“Warfighting readiness and enabling consequential decision-making are at the top of the list. These priorities are critical components for future success and, thus, the target of our analytic efforts.”

“Searching for Specific Tactical Network Design Solutions to Enable Tactical Operation Decisions”

As the efforts gain steam, teams plan to leverage an open-standard design for easy integration and promote innovation while keeping costs down through increased competition. System developers need to ensure that they understand the degraded signal challenges in the Army’s network, which are much greater than in commercial networks, as well as other specific objectives to include:

Assisting the Army in rapidly provisioning tactical network nodes:

Software-defined networking experimentation has shown decreased provisioning time, especially when paired with virtualization and containerization, which further reduces the overall data size and speeds provisioning.

Supporting rapid unit task reorganization:

 The Army needs dynamic, flexible re-provisioning to reflect changes in mission and assigned units. This functional gap extends beyond the traditional software-defined networking capabilities and needs to allow for the tailoring of each tactical network device.

Optimizing routing in the tactical network:

There is a need for software-defined networking to behave opportunistically. Because of the Army’s degraded network challenges, software-defined wide area networking solutions must enhance the network when the remote network controller is available, and enable nodes to operate independently when it is not available.

Simplifying network management:

Experimentation reveals that automating network configuration changes makes it easier for the network node operators on the ground. However, network management, including configuration changes, can still be quite complex for the signal Soldier team to execute. There is opportunity to automate many of these functions.

Increasing security in the tactical network:

The Army is looking at software-defined networking to assist in rapid response through centralizing the ability to conduct changes to security policy, patching and configurations to support defensive network operations. This would enable Soldiers at the remote controller location to send out patches or updates throughout the entire network.

“Demand and Capacity Use Case”

Efforts to develop predictive demand forecast capabilities across multiple mission domains are aimed at maximizing effectiveness and efficiency in the use of limited assets and constrained networks and nodes, which are a reality in daily, as well as wartime, globally integrated operations.

The resulting analysis is aligned with component capacity assessments and arms decision-makers with information that will justify actions to meet mission needs and support the warfighter.

The demand and capacity forecasts will be linked across multiple command activities from rate setting and budgeting, to operational projections to develop optimized transportation solutions and readiness management.

The services have successfully applied demand and capacity analytics using modeling and simulation. Leaders must identify any mobility capability gaps and shortfalls, describe the associated risk in conducting operations, and recommend mitigation strategies where possible; the analysis will also include the near-term mobility implications of emerging warfighting concepts.

“Aerial refueling capacity and sealift recapitalization analysis are among the most compelling analytic needs of the enterprise and are major focus areas for functional team training.”

The assessment is expected to identify potential mobility capabilities, operational approaches, and necessary mitigations to shape how we think and posture for the future. “We’re studying how we adapt and think differently in terms of both technology and operations to ensure our systems are capable and relevant in the future.”

“Mainstreaming data analytics into operational planning efforts will create actionable information for decision-makers. The time and energy previously directed on lagging, low-value activities can now be applied to proactive, higher-order decision making options where commander judgement is most appropriate and useful.”

“Troop Assessments Determine Potential of Integrated Networking Tools”

Networking in tactical military environments improves capability by moving data storage from a device to a data storage facility; software-defined networking is a network modernization approach that relocates network routing control functions to a secure remote location.

The Army understands that to receive better, more tailored solutions it needs to share open application programming interfaces and use cases, to include interfaces for accessing initialization data; integrating with network operations tools; accessing network condition information; application-aware routing that allows applications to respond to the network’s availability; and application self-provisioning.

As part of the basic networking, before information is transmitted, it is broken up into smaller digital data packets. The network then chooses the best path, or route, to send each data packet and, once packets reach their destination, the network reassembles them. 

The network performs two basic processes on the data packets-one process focuses on forwarding the packets to their destination and is referred to as the “data plane,” and the other focuses on routing the packets and is referred to as the “control plane.”

In the Army’s current, traditional network, these two process planes are located and implemented together at a local level by a tactical network node’s hardware networking operating systems. On the other hand, in a software-defined networking design, these two process planes are separated. The data plane forwarding functions remain with the local network device, but the routing control plane functions are extracted, turned into more dynamic software, and centralized at a network operations facility, or in an installation network environment, where they can be managed collectively by experienced signal Soldiers.

The remote routing controller knows all of the nodes that it can manage, and it can sense when there is congestion in the network or when there are dropped data packets, due to things like bad satellite connections or enemy jamming. 

Through metrics embedded in the software, this intelligent controller can sense the most efficient path available and tell the nodes in the network to route around the issues. The Army’s current software-defined networking efforts are setting the stage to optimize routing even further by leveraging machine learning when the required technology becomes available.
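To illustrate the idea, here is a minimal sketch of a controller picking the least-cost path over sensed link metrics and re-routing when a hop degrades; the node names and link costs are invented:

```python
import heapq

# Hypothetical link costs sensed by the remote controller (higher = more congested/degraded).
links = {
    "CP-main":    {"relay-1": 2, "satcom-1": 7},
    "relay-1":    {"CP-main": 2, "vehicle-12": 3, "satcom-1": 9},
    "satcom-1":   {"CP-main": 7, "relay-1": 9, "vehicle-12": 4},
    "vehicle-12": {"relay-1": 3, "satcom-1": 4},
}

def best_path(graph, src, dst):
    """Dijkstra over the sensed link metrics: the control plane picking a route."""
    queue = [(0, src, [src])]
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, weight in graph[node].items():
            if nxt not in seen:
                heapq.heappush(queue, (cost + weight, nxt, path + [nxt]))
    return float("inf"), []

# Healthy network: traffic rides the terrestrial relay.
print(best_path(links, "CP-main", "vehicle-12"))   # (5, ['CP-main', 'relay-1', 'vehicle-12'])

# Jamming degrades the relay hop; the controller re-routes over SATCOM.
links["relay-1"]["vehicle-12"] = 50
links["vehicle-12"]["relay-1"] = 50
print(best_path(links, "CP-main", "vehicle-12"))   # (11, ['CP-main', 'satcom-1', 'vehicle-12'])
```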

“Overcoming a Degraded Network Environment”

A software-defined networking design could enhance system and network simplicity for tactical users, since it moves some of that network complexity to a network operations center. However, the Army will have to leave enough of the routing control functions locally, within the tactical device, to get through network challenges found in degraded signal environments. These degraded network challenges include network transport environments that are highly latent, disconnected, intermittent and low-bandwidth.

The tactical network is an interconnected mesh design, with different-sized line-of-sight and beyond-line-of-sight systems that exchange data over different frequencies and multiple transmission paths. Together these unified systems enable secure network connectivity and data exchange across the force, from a large command post down to the Soldier on the ground with a handheld device. 

But degraded network challenges are inherent in the Army’s tactical network, and not just because of its size, breadth and complexity. Connectivity issues can also be caused by topography like mountains or buildings that block signals; on-the-move communications; or, increasingly, enemy jamming.

In recent pilot efforts with operational units, the Army has been experimenting with both software-defined networking and software-defined wide area networking. These laboratory experiments and operational unit pilots are underscoring the need for solutions to detect and route around network interference and congestion, and to load-balance flows across multiple transmission paths, to increase network speed, performance and reliability.

Additionally, the network will need to have a fallback to compensate for degraded network emergencies, when the tactical network systems on the battlefield can’t “talk” to the remote network routing controller. To offset these scenarios, software-defined networking solutions will need to incorporate capabilities such as initialization data products and basic router configurations that reside locally, which the tactical network system can leverage until stronger network connections to the remote intelligent routing controller are restored.

If the Army switches to a software-defined wide area network design, the remote network controller will need to include software that implements a strong and automated primary, alternate, contingency and emergency routing plan, so that it can automatically route and reroute signals over multiple transmission paths, choosing the strongest available paths for optimal connectivity and resilience. 
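A hedged sketch of such a primary/alternate/contingency/emergency (PACE) selection step, including the fallback to locally stored configuration when the remote controller is unreachable; the path names and health checks are assumptions:

```python
# Path names below are invented for illustration.
PACE_PLAN = [
    ("primary",     "line-of-sight mesh"),
    ("alternate",   "SATCOM wideband"),
    ("contingency", "HF data link"),
    ("emergency",   "store-and-forward via courier UAS"),
]

def select_route(link_status, controller_reachable, local_fallback="cached router config"):
    """Walk the PACE plan in order and take the first healthy transmission path.

    If the remote routing controller itself is unreachable, the node falls back to the
    initialization data and basic router configuration stored locally.
    """
    if not controller_reachable:
        return ("local", local_fallback)
    for tier, path in PACE_PLAN:
        if link_status.get(path, False):
            return (tier, path)
    return ("local", local_fallback)

status = {"line-of-sight mesh": False, "SATCOM wideband": True, "HF data link": True}
print(select_route(status, controller_reachable=True))    # ('alternate', 'SATCOM wideband')
print(select_route(status, controller_reachable=False))   # ('local', 'cached router config')
```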

Army wants to ensure continuity of operations, to enable network routing to be seamless and transparent to the tactical user, so Soldiers can focus on the mission and not the network. Networking goals include:

1. Reduce complexity for the tactical user

2. Simplify network management for communications officers

3. Achieve the ability to rapidly provision and re-provision network nodes

4. Configure nodes based on mission

5. Prepare nodes for operational use on the network

6. Improve network resilience

7. Include an automated primary, alternate, contingency and emergency routing plan

8. Increase network security

9. Facilitate network management/administration

10. Make signal prioritization easier, more flexible and effective
0 Comments

Top 10 Strategic Tactical Operational Plans to Build Sensors/Weapons Support Multi-Domain Operations

4/1/2020

2 Comments

 
​Navy wants its ships, submarines, drones, and on-shore intelligence analysts to be able to share data in real time, across domains — without fail. 
​
A disaggregated, networked multi-domain force design enables commanders to adapt and create unpredictable, resilient operational compositions, scalable to low-end conflicts without wasting capabilities or capacity.

“We need to get better information to the decision-maker. The information is there, somewhere. We’re just not getting it in the right hands at the right speed.”

MDO reflects how smaller force structure elements can be rearranged into many different configurations or force presentations. The design employs many diverse, disaggregated platforms in collaboration with current forces to craft an operational system. Functional capabilities hosted on a common platform like a combat aircraft, such as radar, fire control, and missiles, can now be disaggregated into their smallest practical pieces.

MDO is a force design that combines the attributes of highly capable systems with the volume and agility afforded by smaller force elements that can be rearranged into many different configurations or presentations. 
The MDO coordinated system is made possible by the use of advanced networks, data links, and enablers that employ automation and artificial intelligence/machine learning to connect its disparate capabilities.

MDO leverages the dynamic relationship between force structure and operational concepts—means and ways—to regain offensive initiative against enemy systems warfare. Said another way, a traditional approach to dealing with emerging threats is to devise new, more effective ways to use existing military forces, or acquire new capabilities that will improve a military’s ability to perform its missions. The MDO concept does both.

MDO force builds upon current investment in important, highly capable systems to yield smaller, more numerous, disaggregated elements.  Disaggregated elements may network together to create a coherent operational system in partnership with highly capable system platforms.

Systems warfare strategies target data links and their nodes to collapse the effectiveness of a system. In a MDO force, there are no single points of failure, no single data link, no universal standard, no one type of waveform on which enemies can concentrate. This is the point of functional decomposition. 

The key to MDO design is the quantity and the composition of the nodes it can create in an area of operation. Disaggregated elements contribute to system resiliency, because their loss represents the loss of only one function and not the many functions of a highly capable, traditional platform. 

Networked together, disaggregated elements can create outsized value for the force. The larger system functionality continues even when attrition occurs, since there is no single node or small set of nodes whose loss will collapse the entire system. Disaggregation also expands the number of potential kill paths, posing a targeting problem to an adversary.

“We should be able to, with this level of technology, have what I’m seeing in my carrier available in the Maritime Operations Center (MOC) in real time, so that when a fleet commander hears instructions like ‘these are my intentions and I’m engaging here,’ the commander can look at it and go, absolutely, and I’m moving this or that or whatever the case may be to support you.”

The military has thousands of soldiers distributed across the battlefield and thousands of vehicles. That requires a very different scale of networking and AI than connecting smaller numbers of more expensive platforms, from Navy warships to Air Force jets to even Special Operations teams.

“We have to have the ability for that operator, when he looks at that track, to have confidence, whether it’s coming from an unmanned vehicle 200 miles away, that it’s the same thing they’re seeing on a cruiser. 
 
That data needs to be readily available across the network so when he tells commander that he wants certain targets hit, or others to be left alone, everyone involved from the top of the command chain to the bottom understands what is happening, and what may or may not be a threat.

The requirement for an integrated combat system is to give a common operating picture to leaders at sea and ashore, as well as to integrate information from different kinds of sensors and use that to support both hard-kill and soft-kill responses from the Navy.

Navy is making progress linking sensors and other sources of intelligence and data, lethal and nonlethal countermeasures and a range of levels of command.

The Navy is making the integration of ships, planes, sensors and weapons a priority going forward and is in the requirements-writing stage of developing an integrated combat system.

“It’s every ship, it’s every radar, it’s every airplane, it’s every weapon. And if we don’t optimize every one of them, the margin of victory is so slim right now we risk defeat.” That’s how leaders are approaching strike group command, and pushing teams to develop the requirements for an integrated combat system.

All those leaders, from the cruiser to the strike group to the fleet commander, should also have a level of awareness that extends to what sensor is doing the tracking and therefore what its limitations might be; and what weapon is most appropriate to go after the threat, so it’s effective but not wasting a costly high-end weapon to defeat a less-capable target.

While leaders at sea might have access to this information, those at the Maritime Operations Center (MOC) ashore see a lag in getting those details. The Navy wants everyone at all levels to have access to the same information in real time “so we can make good, sensible decisions and husband our weapons wisely against the threat. Or not react at all, if that’s the most prudent case, depending on the mission.”

Challenges to Multi-Domain Operations include linking disparate communication and information networks to share targeting data and communications. This stuff is not there yet. It’s less about the platforms; we have got to have the communication architecture. All the ability to do that is there, whether that’s secure waveforms or radio frequency (RF) links, but we have to go out and experiment right now.

"Pentagon Experiments Trying To Link Everything on the Battlefield Using Multi-Domain Operations Concepts"
Experiment by experiment, the company is weaving aircraft, ground vehicles, satellites, and the rest into a network that will someday give commanders unprecedented decision-support options.

The Pentagon’s efforts to digitally connect everything on the battlefield has a big challenge to overcome: getting disparate vehicles and weapons to share data.

“The interoperability of various, different systems, that’s really where we are struggling. We don’t have that machine to machine connection to begin with.”

Over the past several years, we’ve been working to build those connections, piece by piece and plane by plane. We started by asking, “How would we go fight in 2030, 2045?” and then working backwards.

There are efforts underway to link the stealthy F-22 and F-35 combat jets. The Air Force has announced that they are to test a similar link next month, but the Air Force is establishing more complete linkages, including new forms of secure radio linkages using software defined radio, and also including other assets such as drones.
Experiment by experiment, we tried to “systematically work” to build the components of a larger network of networks.

Ultimately, we want all this to add up to a virtualized, cloud-based architecture like the branches of a tree. A handful of ships and planes might form one network. That will, in turn, connect to a larger network that would, in turn, be connected to a still larger network.

“You end up with virtual networks on the edge with a computing architecture you could have on an aircraft, on a ship, or any of the deployed nodes.”

Bringing all these pieces together will enable a new sort of operating system for warfare; a new experimental battle management display illustrates the concept.

The system presents the operator with a list of effects, from devastating explosions to a quiet disabling of some enemy system; a list of available assets, including planes or drones; a map of targets; and recommendations for the best way to deliver effects to targets.

As circumstances change — fuel gets low, ammunition is depleted, targets are destroyed, new enemy forces arrive, etc. — the system can send out alerts that a new plan is needed — or automatically update the plan with new instructions for pilots and drone operators. It all depends on how high the operator wants to set the autonomy.

A run-through of “what we need to win” includes the amphibious force, the submarine force, mine warfare and mine countermeasures communities, combat logistics and more. While much effort has been put into creating and strengthening the Naval Integrated Fire Control-Counter Air structure for the carrier strike group force, other communities in the Navy have been left out of previous efforts.

“We are going to do things differently, we are going to do things in a completely netted environment. … We have the weaponry to go a lot farther than we’re able to do because of the sensors.”

Priorities include long-range targeting, which will require these netted sensors; a universal common operating picture and combat logistics.

Air Force is preparing an experiment it hopes will link the F-22 and F-35 fighter jets, the first in a series of experiments dubbed “connect-a-thons.”

The goal is to identify a fleet of aircraft with a communications issue, invite voices from inside and outside the Pentagon to offer solutions, and then test those offerings in a live experiment.

The F-22 was built with an older data link that can’t match up with the Multifunction Advanced Data Link system used on the newer F-35; while the F-35 can receive data, it can’t share the data back — a key capability given the envisioned role of the F-35 as a major sensor for the future Air Force.

For the test, the service will use a “universal translator” for the two jets. The first test will feature the equipment on a pole on a test range, with the jets pinging their information back and forth from that fixed location.

It’s not the first time a drone has been used as a link between the two fighters. A Global Hawk unmanned system, equipped with a new radio, has been designed to act as a translator between the aircraft, feeding a cloud-based common operational picture that tracks where friendly forces are and displays a map of their constantly updated positions.

Why is this so difficult? As stealth aircraft whose whole raison d’être is to evade detection, the F-22 and F-35 would rather not use conventional radios to communicate in combat because the transmissions are too easy for an enemy to pick up.

So both jets use so-called Low Probability of Detection/Low Probability of Interception communications – but they each use different ones that operate on different frequencies with incompatible software. F-22s use a unique Intra-Flight Data Link that works only with other F-22s, while the newer F-35s use the Multifunction Advanced Data Link, which can only talk to other F-35s.
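The real IFDL and MADL formats are not public, so the sketch below uses two invented stand-in formats purely to illustrate what a “universal translator” gateway has to do: parse a track from one link’s encoding and re-emit it in the other’s:

```python
import json

def from_format_a(raw: str) -> dict:
    """Parse a comma-separated track report (invented 'Format A')."""
    track_id, lat, lon, kind = raw.split(",")
    return {"track_id": track_id, "lat": float(lat), "lon": float(lon), "kind": kind}

def to_format_b(track: dict) -> str:
    """Emit the same track as a JSON message (invented 'Format B')."""
    return json.dumps({"id": track["track_id"],
                       "position": [track["lat"], track["lon"]],
                       "classification": track["kind"]})

def gateway(raw_a: str) -> str:
    """The translator in the middle: Format A in, Format B out."""
    return to_format_b(from_format_a(raw_a))

print(gateway("T0042,36.1716,-115.1391,hostile-air"))
```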

Of course, real-time data sharing across platforms isn’t a simple or clear-cut affair, even after successful experimentation. It’s hard simply to share data between operators and just one platform. The challenges of sharing data between multiple platforms, in the middle of battle in a highly contested airspace, are far larger.

1. As a force design, multi-domain operations provides guiding principles for developing operational concepts and future technologies, conducting operational planning, presenting forces, and commanding and controlling them in real-world operations. 

2. It is expected that elements of a multi-domain force will mature at different paces, resulting in their gradual integration into the force instead of a sudden, sweeping force transformation. 

3. Multi-domain force design concept will encourage the development of new, highly interoperable capabilities designed to speed through the acquisition system. 

4. Multi-domain force is less about “what” a new system is and more about how it will behave within a broader enterprise. Swarms of expendable systems may be a design element of a future force design, as will other weapon systems and concepts. 

5. The plan will create a heterogenous mix of many different types of elements, functions, and capabilities that can collaborate in unexpected ways to complicate an adversary’s planning and targeting. 

6. As multi-domain force elements are fielded, they will change how highly capable platforms are employed, further enhancing their value and effectiveness. 

7. Multi-domain offers an alternative to creating a single architecture or standard, relying instead on diversity, complexity, and resilience. 

8. Imposing a unitary requirement on every platform in a combat zone is costly, results in the procurement of systems that are quickly obsolete, and allows competitors to understand and adapt to new systems quickly. 

9. Multi-domain will require systems to be interoperable, but this requirement does not require a single standard. Using many different types of data links, waveforms, and message formats will increase the resiliency of networks. 

10. Multi-domain is not a tightly integrated system of systems where the failure of one system could impede the development of others or collapse the whole architecture 
2 Comments

Top 10 Tools Build High-End Capable Platforms Provide Value in Future Multi-Domain Operations

4/1/2020

0 Comments

 

​The goal of Multi-Domain Operations is to get something that works well enough to test in real-world conditions and get feedback from real pilots. Then you take that data and improve your solution and run the improved version through another test- then rinse and repeat until you get something good enough to field to actual combat forces.

We have product categories that we care a lot about. We want to be able to integrate sensors. We want to get data off of them. We want to secure the process. We want to be able to put applications on the system and connect capability and people together. And we want to output an effect, from jamming a radar, to hacking a network, to blowing everything up.

The system features new methods of data sharing between air and ground forces, a common operational view that can track updated positions, and most prominently, a data connection to allow F-22s and F-35s to share data without exposing their positions.

The F-35 was designed to take in large amounts of data regarding battlefield positions and situations. The F-22 has a more limited mission and capability -- and the F-22's Intra-Flight Data Link and the F-35's Multifunction Advanced Data Link are currently incompatible.

The fighter planes, and other platforms, have different communications protocols and radio frequencies, and were not designed with a digital gateway to integrate their communications capabilities.

"The main point is that we want both the F-22 and the F-35 to be able to share communication over a link that allows them to do so in a way that protects their survivability."

Imagine a network of manned and unmanned systems, with relatively expendable drones actively emitting signals while the rest of the force stays silent, stealthy, and survivable. Imagine a multi-domain command and control network that can pull together forces from air, land, sea, networks reorganizing as needed on the fly. The goal: create a dispersed, flexible force our adversaries’ centralized systems can’t keep up with.

How would that work? A commander inputs the task the force needs to do, identifies the units to be made available for tasking, enters some constraints like geographic bounds, timing, etc., and the system comes back with some proposed courses of action.

To develop the courses of action, the system runs an auction across all the units available to determine which can best accomplish the tasking, much as a ride-hailing service matches passengers’ desired pickup points and destinations with available drivers, in seconds, millions of times a day.

Of course, the artificial intelligence driving this kind of Joint All-Domain Command & Control system would be far more complex than distinguishing between ride types: it would need to know the capabilities of different types of drones, planes, ships, ground vehicles, satellites, and more.

Then it needs to calculate which unit is best able to do a mission based on both its inherent capabilities and its current location. Instead of just knowing where to drop off a passenger, it would need to figure out the best kind of munitions to drop, or jamming to conduct, or network tool to deploy, against a wide variety of targets.
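A minimal sketch of that auction idea, with invented assets, tasks, and scoring weights: each asset “bids” on a task based on capability match and time to arrive, and a greedy pass assigns the highest bidder. A real system would solve the assignment jointly rather than greedily:

```python
import math

# Invented asset and task descriptions; the scoring weights are assumptions.
assets = {
    "F-35-flight":    {"pos": (10, 40), "can": {"strike", "jam"},   "speed": 9},
    "MQ-9":           {"pos": (55, 12), "can": {"strike", "sense"}, "speed": 4},
    "EW-ground-team": {"pos": (30, 30), "can": {"jam"},             "speed": 1},
}
tasks = [
    {"name": "suppress SAM site", "pos": (50, 15), "needs": "jam"},
    {"name": "strike convoy",     "pos": (12, 42), "needs": "strike"},
]

def bid(asset, task):
    """An asset's 'bid' for a task: capability match divided by time-to-arrive."""
    if task["needs"] not in asset["can"]:
        return 0.0
    travel_time = math.dist(asset["pos"], task["pos"]) / asset["speed"]
    return 1.0 / (1.0 + travel_time)

# Greedy auction: each task goes to the highest-bidding asset still unassigned.
assigned = set()
for task in tasks:
    bids = {name: bid(a, task) for name, a in assets.items() if name not in assigned}
    winner = max(bids, key=bids.get)
    if bids[winner] > 0:
        assigned.add(winner)
        print(f"{task['name']!r} -> {winner} (bid {bids[winner]:.2f})")
```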

Traditional military organization is like a jigsaw puzzle, where every piece can fit in one and only one place in the larger picture; the future organization needs to be like a set of tiny building blocks that can be combined in all sorts of ways to make an infinite variety of images.

This kind of networked force could survive enemy attack – physical destruction, hacking, or jamming – by reorganizing itself to pass data around the damaged nodes, making it difficult for adversaries to knock out by jamming a few key links or physically destroying major headquarters, bases, ships, and satellites.

“The tools available to field commanders are insufficient to enable them to develop and plan creative operations. As a result, commanders, particularly junior ones who lack large planning staffs, will tend to fall back on doctrine, habits, and traditions that the enemy can predict.” We need to enable leaders up and down the chain of command to creatively plan, adapt, and recompose their forces and operations.

These new tools would help commanders rapidly retask and reorganize a new kind of force. Instead of relying on large, powerful aircraft that can do all aspects of an electronic warfare mission by themselves – which simplifies both US planning and the enemy’s countermeasures – the future force would disaggregate capabilities across multiple manned and unmanned platforms.

Expendable drones might emit radar signals, while other drones and manned systems would passively receive the radar returns, then compare notes over hard-to-detect datalinks to figure out where the enemy forces were. Other expendable drones – possibly launched from a manned mothership – could transmit the powerful signals required for jamming, but every unit in the network would have the capacity to passively listen for enemy transmissions.

“Basically, we address the one thing on everybody’s mind: how do you do more for less? Budgets aren’t growing, so we really want to effectively manage budgets. This is a great way to do it, because you’re not bending metal, you’re not trying to find a way to put a new component in. You’re able to take whatever’s out there and put it in fast and see what the improvements are.”

“Plans are nothing; planning is everything.” But there is always a question hanging around at this point: how do we know what the ‘correct’ lessons to learn are? And did the outcome happen because of the planning or the execution? It is cheap and simple for units to conduct planning exercises and train their planning staff.

But what about the execution?  What about the current operations team?

Some say the current command post exercise training model does not meet modern requirements. More time and resources need to be directed to those making decisions and supporting the execution of an operation. In presenting some potential solutions to improve execution training, it becomes clear that not enough time or effort has been invested in training the execution.

In contrast to the case for more realistic field training, there is also the case for more focus on decision making. Current operations teams should be able to play, and replay, decision-making reps to hone their ability. This would allow a focus on experimentation and the ability to learn from failure.

Planning Versus Execution

When the execution cell receives the handover/takeover of the plan, the planning team has completed its part and can move on to the next planning cycle. If “plans are of little importance but planning is essential,” then the value is held in ‘the planning’. In contrast to the plans team, the operations cell has not benefited from the ‘planning’ and only has the ‘plan’.

An inexperienced current operations team can butcher a good plan, but a well-drilled and experienced current operations team can save a bad plan. Operations teams, then, need more training and experience for when it all goes wrong.  There is currently no way for units to know if the outcome is due to the quality of the plan or the quality of the execution.

After action reviews normally take one of two formats.  The first is a group discussion.  These discussions are usually dominated by the most vocal team members with a linear view of the action.  The second is a series of smaller working groups, which attempt to get a wider collection of points from a more diverse pool.  Both formats invariably end up as an analysis of ‘why did we make the decisions we did and what lessons can we take away from the consequences’.  Both approaches lack diversity of ideas and are fixed in personal biases.

As this approach looks at how the action unfolded and why, there is no way to analyse whether or not the specific decisions that were made were the best decisions for that situation.  This is because once a decision is made and action is taken, that unit is creating the one path of reality.  There is no scope to explore alternative future realities and this limits the staff ability to experiment and learn from failure.  

The ‘improve’ points generated might have been the best actions for the situation.  By the same logic, they may not be.  Current training design does not allow the staff to actually know if they should ‘sustain’ the lessons identified if there is no comparison against which to assess the relative merits.  This approach lacks both data and intellectual rigour.

The feedback that military training audiences receive during an AAR is either internal introspection, as units discuss their own opinions, or subjective external feedback from the observer with some statistics (e.g. vehicles destroyed) bolted on. Both modes of feedback will have been shaped by those individuals’ experiences and biases. The key point is that the military would benefit from training design that allows units to repeat scenarios.

Reps, Reps, Reps

It is well known that repetition and purposeful practice hone performance through iterative development and compounding knowledge. To best exploit this, there needs to be a format of training in which more decisions can be made, and a system in which those decisions and actions can be judged and compared. The military currently has a good model of training insofar as it allows free-play ‘fighting’, either deployed in the field or through simulation.

“How do we encourage the consideration of tactics today?” This question is especially pertinent if operators have no way of rewinding to try different tactical decisions. In these scenarios there is only ever one outcome to analyse, and this limits the utility of the AAR process. There is no way to understand what was good and what was bad.

These training systems are good because they create friction and unpredictable dilemmas in which to test military operators. But if we don’t fail, then have we even tried?

Our execution teams run through one sequence of missions in a linear manner several times a year. This is not enough to inculcate good practice and hone the skills needed. Either more time needs to be given to the execution of missions, to put those in current operations through the necessary iterations and frictions, or there needs to be an alteration in training design to maximise the reps.

What can we do about it? Change training design. 

The military has acknowledged the requirement for more execution training, but it has moved in this direction cheaply and slowly. Some wargames offer a method of training the execution team. Yet despite stemming from a brilliant idea, finding a cheaper way to train execution, the game is confusing and unintuitive. Worse, the military has yet to invest sufficient time in it. More must be done if wargaming is to survive, let alone thrive.

There are better systems already in place that can be used to simulate tactical actions and generate different scenarios. The benefit of electronic systems is that they are easier to use and they already exist across the infrastructure. Whatever system is used, training design must be changed to enable more reps.

Both systems should be used the way a chess player hones their skills. Chess players can set up the board from a famous match and play it out, considering the decisions required. Instead of seeing training as a linear model, the board can be reset to allow a different decision. This enables comparison.
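To make the chess analogy concrete, here is a minimal sketch of how a training cell might record scenario “checkpoints” so an exercise can be rolled back to a decision point and replayed with a different choice. The names (ScenarioState, Checkpoint, ReplayableExercise) are invented for illustration and are not drawn from any fielded simulation system.

# Minimal sketch (hypothetical names): checkpoint a scenario at each decision
# point so the "board" can be reset and a different decision played out.
import copy
from dataclasses import dataclass, field

@dataclass
class ScenarioState:
    """Snapshot of the exercise at a moment in time."""
    phase: str
    friendly_units: dict = field(default_factory=dict)
    enemy_units: dict = field(default_factory=dict)

@dataclass
class Checkpoint:
    label: str            # e.g. "Decision Point 2: commit the reserve?"
    state: ScenarioState  # deep copy of the state at that moment

class ReplayableExercise:
    def __init__(self, initial_state: ScenarioState):
        self.state = initial_state
        self.checkpoints: list[Checkpoint] = []

    def mark_decision_point(self, label: str) -> None:
        # Store an immutable snapshot so the exercise can return to it later.
        self.checkpoints.append(Checkpoint(label, copy.deepcopy(self.state)))

    def reset_to(self, label: str) -> ScenarioState:
        # "Reset the chess board": reload the saved snapshot and branch from it.
        for cp in self.checkpoints:
            if cp.label == label:
                self.state = copy.deepcopy(cp.state)
                return self.state
        raise KeyError(f"No checkpoint named {label!r}")

In use, the current operations team would mark a checkpoint before each major decision, play the action forward, then reset and branch down an alternative, giving the after action review two or more paths to compare rather than a single path of reality.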

Execution Training

Execution training should expose current operations cells to decision-making stressors and force them off the plan. To do this, the military must amend how it runs training exercises and focus on maximising decision-making opportunities over getting exercising troops into a realistic battle rhythm.

The benefits of field training are significant, but we should not lose focus on what we need to train: the ability of operations teams to control a plan, not necessarily to operate in the cold.

A benefit of this type of staff training is that it creates more decisions and therefore more feedback. This provides valuable data that can be harvested, interrogated, and fed back. In turn, this increases realism and can be used to challenge current assumptions. It could also deepen the pool of information that a future artificial intelligence could use.

Focusing on decision making

Currently, command post exercises require the current operations team to track battle rhythm reports and returns as well as fight the enemy. There is little focus on decision-making. Acknowledging that reports and returns are an important part of administrative routine, there are more important things the current operations team needs to train.

A focus on decision making would see current operations cells put under stress by conducting two or three iterations of the same scenario. This would allow the current operations team to make slightly different decisions along the way and test different methods. The plan would remain the same, but the various decision points throughout could be explored. This would result in a wealth of information to analyse and compare. Reps, reps, reps.
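As a rough illustration of how those repeated iterations could be captured and compared, the sketch below logs each decision as a structured record and then lines up the same decision point across reps. The record fields and the simple metrics used here (objective_secured, time_to_decide_minutes) are illustrative assumptions, not an established data standard.

# Hypothetical sketch: log decisions from each rep of the same scenario and
# compare the same decision point across iterations.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class DecisionRecord:
    iteration: int               # which rep of the scenario (1, 2, 3...)
    decision_point: str          # e.g. "DP2: commit the reserve?"
    choice: str                  # what the current operations team decided
    time_to_decide_minutes: float
    objective_secured: bool      # simple outcome metric for comparison

def compare_reps(records: list[DecisionRecord]) -> dict:
    """Group records by decision point so different choices can be compared."""
    by_point = defaultdict(list)
    for rec in records:
        by_point[rec.decision_point].append(rec)
    summary = {}
    for point, recs in by_point.items():
        summary[point] = [
            (r.iteration, r.choice, r.time_to_decide_minutes, r.objective_secured)
            for r in sorted(recs, key=lambda r: r.iteration)
        ]
    return summary

Even this small amount of structure gives the after action review something objective to compare across reps, and the same records could feed the analytical or artificial-intelligence tooling discussed above.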

To use a real-life example: sports teams at all levels play training matches in which they run through their moves and different patterns of play. The coaches pause when they wish to discuss an individual’s decision, or when they want to show everyone the bigger picture, the shape of the team, or an opportunity or threat. They often send the ball back two or three phases to restart the action and address the coach’s point. Imagine that level of replay and the value it could bring to an execution team.

To achieve this would take more exercise planning and engaged higher and lower command cells to speed up the reset.  Yet, with the right system in place, exercises can be fast-forwarded through mundane routine to focus on critical areas and decision-making.  This would be mutually beneficial as the red forces would also benefit from more focused decision making training.

1. Traditional system-of-systems operations depend on extremely rigid rules, roles, and responsibilities. Multi-domain instead seeks to construct a system-wide federation of interoperable platforms, capabilities, and enablers that have stand-alone value, can collaborate, and remain operationally effective while absorbing failures and losses.
 
2. As a loosely coupled system, a multi-domain force design can quickly assimilate and use new capabilities as they mature.

3. Advanced information networks enable a multi-domain force design; it does not seek to connect all things all of the time. The interdependence of capabilities means they must have the ability to connect in a highly automated manner when needed. 

4. The distribution of too much data to too many entities in a network can slow its operations, since each entity will need to filter massive amounts of information to determine what is necessary. Sometimes less is more. 

5. Getting the right information to the right entity when needed does not require constant connectivity. What a multi-domain network does require is information processing at the combat edge. This will require smart tools and routers to identify the data that specific entities need and the best path to pass the information to them (a minimal sketch of this filtering idea follows the list).

6. Multi-domain does not reach back to pull information from a fixed, distant control center. Relying on rear-area processing creates a risk that actors in a combat zone will not get the information they need.

7. The physical distances involved also induce latency into the system that can mean the difference between mission success and failure. 

8. Increased computational power and faster processing speeds will allow the development of processing nodes that can be placed at the forward edge of an area of operation to push information to multi-domain operations.

9. Creating multi-domain will provide the means to break with old, well-ingrained approaches to waging war, including the tight, centralized command, control, communications, and execution practices that have become habituated.

10. Changing training practices to ensure battle management officers are able to make decisions at the combat edge will be essential to multi-domain warfare. The force cannot win systems warfare in the information age with a single, or even a small handful of, network or nodal entry points that inject too much delay into observation, orientation, and decision time and concede an important advantage to the adversary.
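Items 4 and 5 above describe filtering and routing only the data a given entity needs. The sketch below shows one highly simplified way an edge node might do that: entities register the message types they care about, and the node forwards only matching traffic. The names (EdgeRouter, subscribe, publish) and the message format are illustrative assumptions, not any programme of record.

# Hypothetical sketch of edge filtering: forward each message only to the
# entities that registered a need for that kind of information.
from collections import defaultdict
from typing import Callable

class EdgeRouter:
    def __init__(self):
        # topic -> list of delivery callbacks for subscribed entities
        self.subscriptions: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, deliver: Callable[[dict], None]) -> None:
        """An entity declares what information it needs."""
        self.subscriptions[topic].append(deliver)

    def publish(self, topic: str, message: dict) -> int:
        """Push the message only to entities that need it; return delivery count."""
        handlers = self.subscriptions.get(topic, [])
        for deliver in handlers:
            deliver(message)
        return len(handlers)

# Example: a fires cell only wants targeting tracks, not logistics traffic.
router = EdgeRouter()
router.subscribe("targeting.track", lambda msg: print("fires cell received:", msg))
router.publish("targeting.track", {"track_id": "T-101", "grid": "38SMB1234"})
router.publish("logistics.status", {"unit": "A Co", "fuel_pct": 62})  # no takers

The point is not the mechanism but the principle in item 4: less can be more when each entity only has to process what it asked for.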

Top 10 Enablers Assist Commanders With Sharing Information to Execute Multi-Domain Operations

4/1/2020


 
The military must use its limited time better to practise the execution of Multi-Domain Operations. Staff training serials are not maximised to enable units to make more decisions and to experiment. There is a focus on a linear mission over the required training. There needs to be a greater focus on staff execution training to establish the current operations function. This is best achieved through iterative, incremental changes to training design with more objective feedback.

The Multi-Domain Operations vision is very different from the way mission tasking works today. “Right now, our commanders are very limited in who they can assign to do certain things. More often than not, you have to assign someone because they happen to be in a specific place in front of a specific computer.”

The Navy is making integration of ships, planes, sensors, and weapons a priority going forward and is in the requirements-writing stage of developing an integrated combat system. Today’s ability to pass information from sensors to operators at sea to fleet commanders ashore is not happening “at the speed of warfare.”

“We should be able, with this level of technology, to make what we are seeing on the carrier available in the operations center in real time, so that when a fleet commander is told, ‘These are my intentions and I’m engaging here,’ they can look at it and say, ‘Absolutely,’ and move this or that, or whatever the case may be, to support you.”

“We owe the decision-maker better information. The information is there, somewhere; we’re just not getting it into the right hands at the right speed.”

Operators must wholeheartedly adopt the belief that it is OK to fail and to learn through mistakes, and design training that allows that to happen. In their current form, however, the services lack a method to replay mistakes and actively learn from them. This would be made possible by improving our training practices and mindset, focusing on feedback and mission design, and increasing acceptance of a wide range of views.

By testing some of these suggested changes, units may be able to experiment and train to failure. More reps would enable more creative decision making and allow units to explore more diverse avenues than current training permits.

Organizational learning remains an important component of mission success. Militaries are unlikely to begin wars optimized for the challenges they will encounter. General-purpose forces often train for general combat operations rather than tailoring their personnel selection, training, and education for a specific threat. There are notable exceptions in areas of longstanding hostility, but even when units do train for specific threats, they are likely to encounter unexpected events or conditions. 

Even if a military begins a war optimized, it is unlikely to remain so. Because adversaries attempt to negate strengths and target weaknesses, successful techniques will become less effective, and the most successful sometimes become ineffective the fastest.


Multi-Domain Operations permeates areas such as problem solving, decision making, communications, and promotions. It creates insiders and mavericks.

The established insiders of an organization usually like the culture as-is because it led to their success. They do not like change because it puts the mavericks on par, or worse, in charge. 

We say that we want Soldiers who can practice disciplined initiative, who can change the task to meet the purpose. We say we want shared understanding and teams built on trust more than rules. We want them to be risk tolerant.

Yet we do things that scream the opposite. Yes, attention to detail matters. Yes, there is a balance between free expression and unit cohesion, but building a risk-tolerant team does not happen through this sort of narrow conformity. No amount of mission command rhetoric can overcome “follow the rules, or else.” It is one example of how our culture typically eats mission command for breakfast.

This model is compelling because it is balanced and defies easy prescriptive answers. It is at once friendly to the tenets of mission command yet helps us remain grounded in and responsive to our need to steward a large and complex system. It offers a way for you and your unit to develop simple and concrete action steps toward prudent change without sacrificing what makes the unit work.

Diagnosing the Command Structure

One side of the command model describes initiative, innovation, and risk. Opposite it are the people who prefer structure, predictability, and rules.

Without rules, initiative, innovation, and risk can devolve into destructive opportunism. On the other hand, without innovation, structure often calcifies. It is a competing values framework because organizations need the best of both; the values pull each other toward the center and toward a positive zone.

On the other axis, a culture focused on trust, commitment, and teamwork sits against a high-achieving, mission-focused “market” culture. On this axis the drive to compete can produce toxic leaders bent on winning at all costs, whereas a team too focused on keeping everyone happy can lose sight of the job that needs to be done.

Soldiers, even after disaggregating for rank and unit type, are saying that they would prefer a mission command skew to the current command-and-control skew. This raises the question: if we all want the same change, what is holding us back?

We could easily blame “the system”, and in many cases we would be right, but that is often a blanket cop-out. We have plenty of influence with our people in the smallest of everyday events. To find out where and how requires that we move past diagnosis and into the next phase: discussing what change means and what it does not mean.

With that comes a warning: you and your team are probably defining change in unexpectedly different ways. In my experience it is common for an infantry platoon leader and an executive officer to have different understandings of trust, boldness, winning, and what structures are necessary.

Not sure? Ask people to define these simple words; it is quite revealing. Both might answer the survey in a way that suggests the same current and preferred culture, yet identical survey answers are not necessarily signs of agreement. Two respondents can have scores and graphs that look the same while the values underlying their choices are quite different.

You do not want a unit culture where Platoon Leaders are free to write their own doctrine or Squad Leaders each develop their own counselling processes. To create a culture that works on trust, accepts risk, and encourages initiative, you need to clarify meaning and calibrate actions by setting clear left and right limits. You need to define the positive zone. The way to do this is to hold workshops where teams brainstorm exactly what they intend to change.

As the unit’s commander, you should moderate the discussion. As with the above example, you need to play the honest broker, but you should also be prepared to provoke the team. Use the session to ask questions and propose ideas that clarify meaning as well as offer approaches. 

For instance, if your team is at a loss for ideas and is offering only non-specific issues, e.g. “training is not my business,” then provoke them. Ask, “Does this mean officers should stop attending training meetings?” or, better yet, “What if officers did stop going to training meetings?”

Lastly, do not allow them to “assume away the problem” by suggesting “it will never happen” or perhaps worse, the unexamined “we already do that!” Your job is to stir the pot enough to generate different points of view or new ideas; ask “what if” or “why not” and see where the conversation goes. Foster shared understanding and ideas will emerge.

Developing an organizational culture that is at once amenable to the entrepreneurial spirit of mission command but responsive to the needs of the institution and its rules is not easy. The deliberative work around agreeing to what change means and does not mean as you review the data from your own organization will provide your team with the confidence necessary to “innovate” successfully.

Learning to Disrupt Ourselves: Creating an Experimental Mode of Operation

The After Action Review. Like morning PT and issuing salutes, the AAR has become just something we do. And for good reason. It has served its purpose remarkably well. But is it optimized for the future battlefield?

While AARs use event-based feedback for evolutionary performance improvement, they are not designed for the discovery learning and experimentation central to multi-domain operations. Overcoming this limitation and harnessing tactical innovation requires creating new tools that complement the incremental performance benefits of AARs with practices from data science and a framework called behaviour-centered design.

Employing the right learning strategy requires first mapping problems along a continuum, one that reflects the concept of “known knowns,” “known unknowns,” and “unknown unknowns.”

 While AARs are ideally suited to the “known known” domain of established challenges and solutions, they are fundamentally unable to address the ambiguity of the “unknown unknown” domain surrounding emerging warfare concepts. This transition from addressing the “known known” quadrant to effectively engaging with the “unknown unknown” domain moves from the “predictable path”—with its consistent outcomes, “right” answers, and linear chains of causation—to the “bold path” and the unprecedented frontiers of innovation.

AARs cannot make this transition in their current format because they ask the wrong people the wrong questions, in pursuit of the wrong outcomes, to handle ambiguity. The wrong question: “What should have happened?”

When AARs were introduced, their use of ground-level feedback was transformational. They enhanced training quality and led to objective evaluation benchmarks like the “task, conditions, and standard” format. This success led to widespread adoption, starting with combat training centers and then across the services as a primary driver of organizational learning.

The mass adoption, however, led to overreliance on a tool designed for the “false realities” of training environments instead of the operational complexity of the real world. This reality gap keeps expanding as the pace and technological complexity of modern battlefields increase.

The military has continually acknowledged changing characterisations of the operational environment, from volatility, uncertainty, complexity, and ambiguity to the more recent idea of wicked problems with “no definable problem statement, no objectively correct answer, and layers of uncertainty and unpredictability.” Yet our learning tools lag decades behind.

The jump from learning in training environments to complex operational challenges requires identifying and examining a system’s underlying assumptions. This shift can be understood as moving from single-loop learning (assessing whether an outcome is achieved) and double-loop learning (evaluating the validity of the metric) to triple-loop learning (examining the cognitive processes that produce our systems and asking whether we have the right targets).

Central to triple-loop learning is challenging fundamental assumptions and unveiling cognitive biases, which requires the use of open-ended questions like “how might we” execute a task or conduct an operation. This line of inquiry prizes divergent thought, stimulates experimentation, and identifies blind spots by focusing on first principles. This is not feasible with current AARs.
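To make the three loops concrete, the toy sketch below frames them as successive checks on an exercise result: did we hit the target, is the target the right measure, and did we examine the thinking that produced it. The field names are invented for illustration and are not drawn from doctrine.

# Toy sketch (invented names): the three learning loops as successive checks.
from dataclasses import dataclass

@dataclass
class ExerciseResult:
    metric_name: str            # e.g. "time to clear the objective"
    target: float
    observed: float
    metric_still_valid: bool    # double loop: is this the right measure?
    assumptions_reviewed: bool  # triple loop: did we question how we chose it?

def single_loop(r: ExerciseResult) -> bool:
    # Did we achieve the outcome we set?
    return r.observed <= r.target

def double_loop(r: ExerciseResult) -> bool:
    # Is the metric itself still the right thing to measure?
    return r.metric_still_valid

def triple_loop(r: ExerciseResult) -> bool:
    # Did we examine the thinking that produced the metric and the target?
    return r.assumptions_reviewed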

These  challenges aside, AARs have become underleveraged vestiges of innovation in the eyes of most military leaders. They are less a source of inspiration than final scripted box checks in the training model.  Making matters worse, their inability to drive discovery learning has triggered ever-decreasing subordinate buy-in, resulting in a culture more likely to accept the status quo than challenge norms.

The wrong people: “What did you learn?” Outside of combat training centers and some regional training institutes with dedicated observer coach/trainers, traditional AAR audience polling and information collection methods are insufficient to identify the areas requiring the most organizational attention and energy. This shortcoming is especially common in AARs at the battalion level and below, which often descend into shallow, formal procedures lacking engaged dialogue.

When debate occurs, it often prioritizes key-leader inputs and lacks feedback from the most valuable subordinate components—user feedback from the primary training audience. For example, AARs at the division and corps levels often focus on senior-leader dialogue despite much of the work being done by subordinate staffs and command teams who may not be present.

In cases where subordinate staffs do conduct AARs, they are often informal and not shared with the rest of the organization  due to knowledge-management challenges. This lack of sharing amplifies gaps in awareness and shortcomings in collective learning.

Although leader-centric AARs offer forums for decision makers to publicly issue guidance, they risk awareness gaps since leaders rarely participate in every part of training given competing requirements. While leaders can partially mitigate this shortcoming through battlefield circulation, these efforts are generally ad hoc, informal, and rarely capture sufficiently diverse feedback to form a representative sample of all stakeholders.

These shortcomings are compounded by time constraints as AARs are often executed amid competing leadership requirements, limiting the context available to inform effective decision making. Strategies to gather further context during AARs, like guiding discussion with subordinate-generated talking points, still encounter time constraints. The outcome is an environment where the people who know what needs to change are not asked, and those that are asked do not know what to change.
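One low-tech way to close that gap is to collect short, structured observations from every participant rather than only the loudest voices, and let the AAR facilitator see which issues recur. The sketch below is a minimal illustration under that assumption; the fields and categories are invented.

# Hypothetical sketch: tally structured observations from the whole training
# audience so recurring issues surface regardless of rank or volume.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Observation:
    submitted_by: str   # role, not name, to keep feedback frank
    category: str       # e.g. "communications", "battle tracking", "orders"
    sustain: bool       # True = keep doing this, False = improve
    note: str

def top_issues(observations: list[Observation], n: int = 5):
    """Return the n categories most often flagged for improvement."""
    flagged = Counter(o.category for o in observations if not o.sustain)
    return flagged.most_common(n)

# Example use during an AAR:
obs = [
    Observation("battle captain", "battle tracking", False, "lost the picture at H+2"),
    Observation("fires NCO", "communications", False, "call for fire delayed on voice"),
    Observation("RTO", "communications", False, "retrans gap during the move"),
]
print(top_issues(obs))  # [('communications', 2), ('battle tracking', 1)]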

The wrong outcomes: “What should we have done differently?” Another AAR shortcoming is associating successful execution with outcomes like definitive guidance from key leaders on how to “fix” issues. The reason that expecting key leaders to issue guidance is so dangerous in experimental-learning environments is exactly why it is so powerful in other contexts: their responses are experience-based.

Decision-making challenges aside, our institutional knowledge management practices often conflate organizational learning with filling vast and soon forgotten shared-drive folders with AAR documents. While units may have terabytes of “lessons observed,” the process of turning these challenges into refined solutions is often difficult, preventing them from becoming shared “lessons learned.” 
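As a sketch of how “lessons observed” might start becoming shared “lessons learned,” the fragment below indexes AAR write-ups by keyword tags so other units can find them, instead of leaving them in forgotten folders. The tagging scheme and names (LessonIndex, add, find) are assumptions for illustration, not an existing knowledge-management system.

# Hypothetical sketch: a tiny keyword index over AAR write-ups so lessons can
# be found and shared instead of sitting in unsearched shared-drive folders.
from collections import defaultdict

class LessonIndex:
    def __init__(self):
        self.by_tag: dict[str, set[str]] = defaultdict(set)
        self.documents: dict[str, str] = {}

    def add(self, doc_id: str, text: str, tags: list[str]) -> None:
        self.documents[doc_id] = text
        for tag in tags:
            self.by_tag[tag.lower()].add(doc_id)

    def find(self, *tags: str) -> set[str]:
        """Return documents matching every requested tag."""
        sets = [self.by_tag.get(t.lower(), set()) for t in tags]
        return set.intersection(*sets) if sets else set()

index = LessonIndex()
index.add("AAR-23-014", "River crossing delayed by bridging asset allocation...",
          ["wet gap crossing", "sustainment"])
print(index.find("wet gap crossing"))  # {'AAR-23-014'}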

1. A multi-domain architecture, by definition, has numerous nodes and networks, dramatically expanding the potential attack surface. Information can be safeguarded on “zero-trust” networks, and there are likely other, more innovative ways to evaluate, filter, or even quarantine corrupted nodes and networks.

2. Multi-domain will increase the ability to adapt. While multi-domain is not a mesh-type network, its functional architecture will be quite complex and include many potential pathways.

3. Nodes will need the capability to automatically discover, connect to, and identify the information needs of other elements; the ability to adapt when elements are degraded or denied; and the ability to integrate new capabilities automatically (a minimal sketch of such discovery follows this list).

4. There will be no single network or standard required to govern the architecture that could constrain the incorporation of emerging technologies and lock it into technological obsolescence. 

5. Developing the ability to translate data, transform waveforms or other types of links, and integrate new nodes in real time without major gateway nodes will be essential to creating an adaptive force design. 

6. Multi-domain makes use of recent developments in automation and artificial intelligence and enables rapid adaptation and execution of its functions. 

7. These tools will be embedded in every platform and optimized for their function. Identifying the many combat functions of the elements of a multi-domain force design, and developing and training the learning systems that support them, will be critical to the success of the force design.

8. New technology provides the means for dealing with more complex architecture. Building a large, diverse force of disaggregated elements will create concerns over their sustainment and lifecycle costs. 

9. Diversity in platforms means diversity in spares, equipment, training, and other operational expenses. However, many multi-domain elements will be smaller than large multifunctional platforms, and their physical and functional systems may be less complex, easier to diagnose, and easier to repair. 

10. The military logistics directorates will need to assess and develop new ways to sustain a large, heterogeneous force.
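Items 3 and 5 above imply some form of automated discovery and capability registration. The sketch below is one highly simplified way to picture that: nodes announce what they can provide and what they need, and a local registry matches them without a central gateway. All names here (NodeRegistry, announce, match) are invented for illustration, not any real interface.

# Hypothetical sketch: nodes announce capabilities and information needs, and a
# local registry pairs providers with consumers without a central gateway.
from dataclasses import dataclass, field

@dataclass
class Node:
    node_id: str
    provides: set[str] = field(default_factory=set)  # e.g. {"radar.track"}
    needs: set[str] = field(default_factory=set)     # e.g. {"radar.track"}

class NodeRegistry:
    def __init__(self):
        self.nodes: dict[str, Node] = {}

    def announce(self, node: Node) -> None:
        """A new element joins and declares what it offers and requires."""
        self.nodes[node.node_id] = node

    def remove(self, node_id: str) -> None:
        """Model degradation or loss of an element."""
        self.nodes.pop(node_id, None)

    def match(self) -> list[tuple[str, str, str]]:
        """Return (provider, consumer, data_type) pairings currently possible."""
        links = []
        for provider in self.nodes.values():
            for consumer in self.nodes.values():
                if provider.node_id == consumer.node_id:
                    continue
                for data_type in provider.provides & consumer.needs:
                    links.append((provider.node_id, consumer.node_id, data_type))
        return links

registry = NodeRegistry()
registry.announce(Node("sensor-07", provides={"radar.track"}))
registry.announce(Node("shooter-03", needs={"radar.track"}))
print(registry.match())  # [('sensor-07', 'shooter-03', 'radar.track')]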