Envision a scenario in which dismounted infantry soldiers are taking heavy enemy fire while clearing buildings in intense urban combat, when an overhead drone detects small groups of enemy fighters hidden nearby, between walls, preparing to ambush.
As the armed soldiers clear rooms and transition from house to house in a firefight, how quickly would they need to know that groups of enemies awaited them around the next corner?
Getting this information to soldiers in seconds can not only decide victory or defeat in a given battle, but also save lives. What if AI-enabled computer programs could instantly discern specifics about the threat, such as location, weapons, and affiliation, by performing real-time analytics on drone feeds and other fast-moving sources of information, instantly sending crucial data to soldiers in combat?
ATR-MCAS is an AI-enabled system of networked, state-of-the-art air and ground vehicles that leverages sensors and edge computing. The vehicles carry sensors that enable them to navigate areas of interest and to identify, classify, and geo-locate entities, obstacles, and potential threats, reducing the cognitive load on Soldiers.
The system is also capable of aggregating and distributing the target data, which can then be used to make recommendations and predictions based on the combined threat picture provided.
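The aggregation step described above can be sketched in miniature. The names below (Detection, aggregate, the 25-metre merge radius) are illustrative assumptions, not part of ATR-MCAS; the idea is simply that reports of the same entity from multiple sensors are fused into a single entry in the combined threat picture.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Detection:
    """A single sensor report: what was seen, where, and how confidently."""
    label: str         # e.g. "vehicle", "dismount"
    x: float           # local east coordinate, metres
    y: float           # local north coordinate, metres
    confidence: float
    sensor_id: str

def aggregate(detections, radius=25.0):
    """Merge reports of the same label within `radius` metres into one track."""
    tracks = []
    for d in detections:
        for t in tracks:
            if t["label"] == d.label and hypot(t["x"] - d.x, t["y"] - d.y) <= radius:
                # Fuse: average the position, keep the highest confidence,
                # and record every sensor that contributed.
                n = len(t["sensors"])
                t["x"] = (t["x"] * n + d.x) / (n + 1)
                t["y"] = (t["y"] * n + d.y) / (n + 1)
                t["confidence"] = max(t["confidence"], d.confidence)
                t["sensors"].append(d.sensor_id)
                break
        else:
            tracks.append({"label": d.label, "x": d.x, "y": d.y,
                           "confidence": d.confidence, "sensors": [d.sensor_id]})
    return tracks
```

A real system would use geodetic coordinates and probabilistic fusion, but the shape of the problem, many reports in, one deduplicated picture out, is the same.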
The services are developing an AI-based tool for gathering intelligence on potential targets. While previous AI efforts focused on recognizing objects in images, the new project will integrate a variety of data from diverse sources to construct a fuller picture, combining sensor input with operational experience to create an understanding of what's going on.
We’re doing two things: analyzing the video for object identification and classification, and then contextualizing that information. If the aircraft has been spotted before, that fact can be added to the new sighting, and you can determine it’s the same aircraft based on speed or other factors.
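The "same aircraft based on speed" test described above amounts to a feasibility gate, a standard first step in track association. The sketch below is a hypothetical illustration of that idea, not the project's actual algorithm; the 300 m/s gate is an arbitrary assumed value.

```python
from math import hypot

def same_track(prev, curr, max_speed_mps=300.0):
    """Decide whether a new sighting is plausibly the previously seen aircraft.

    prev and curr are (x_m, y_m, t_s) tuples: position in metres, time in
    seconds. The sighting is associated with the old track only if reaching
    the new position would not require exceeding `max_speed_mps`.
    """
    dt = curr[2] - prev[2]
    if dt <= 0:
        return False  # out-of-order or duplicate timestamp: cannot associate
    dist = hypot(curr[0] - prev[0], curr[1] - prev[1])
    return dist / dt <= max_speed_mps
```

Production trackers layer kinematic filters and appearance features on top of such a gate, but the gate alone already rejects physically impossible matches.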
The Mission Data File is the on-board threat library that proactively notifies the pilot of upcoming threats. The problem today is that it takes far too long to actually generate that Mission Data File. Applying data aggregation capabilities and AI can make that process an order of magnitude faster, so the data are more current.
The process today is heavily manual, largely because the data sources and types are so diverse. “The analyst today would have to go data source by data source and then, within each data source, data field by data field, looking, for instance, to see if this database here has this field for an object in the theatre.”
Much of the data is highly unstructured, such as free-text comments, which software does not handle well. The hope is to automate the process of looking through sources and to present the operator with a list of problems, such as potential deficiencies in intelligence, along with recommendations.
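The field-by-field scan the analyst performs manually is straightforward to automate. Here is a minimal sketch, under the assumption that each source can be read as a list of records; the function name and report format are invented for illustration.

```python
def find_deficiencies(sources, required_fields):
    """Scan each data source's records and report which required fields are
    missing or empty, so an analyst need not check field by field by hand.

    sources: dict mapping source name -> list of record dicts.
    Returns a list of deficiency entries suitable for operator review.
    """
    report = []
    for name, records in sources.items():
        for field in required_fields:
            missing = sum(1 for r in records if not r.get(field))
            if missing:
                report.append({"source": name, "field": field,
                               "missing": missing, "total": len(records)})
    return report
```

The hard part in practice is the unstructured text mentioned above, which needs extraction before it can be checked this way; the structured scan is the easy half.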
“This project pushes the existing limits of artificial intelligence and machine learning used for image classification and autonomous navigation.”
“ATR-MCAS is different from existing autonomous system efforts because it is not limited to specific use cases. It can be used to perform reconnaissance missions across the area of operations, or maintain a fixed position while performing area defense surveillance missions.” ATR-MCAS capabilities also extend to other ground warfare missions such as route reconnaissance, screening missions, or the verification of high-value targets.
Once threats are identified by the autonomous sensors, basic information about them is relayed back to the Soldiers through a mobile network.
Threats are advertised in a common operating picture (COP) that provides an aggregated view of the battlefield. The information is then processed by an AI-enabled decision support agent, which can make recommendations such as prioritizing the threats for Soldiers. This aggregation is achieved not through static data standards but through robust data mediation, which allows for improved interoperability across ground and air systems.
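A decision support agent's prioritization step could look something like the following sketch. The scoring formula (confidence weighted against distance) is an assumed, illustrative heuristic, not the actual agent's logic.

```python
from math import hypot

def prioritize(threats, own_position):
    """Rank aggregated COP threats for a Soldier: closer and
    higher-confidence threats first. Weighting is illustrative only."""
    def score(t):
        dist = hypot(t["x"] - own_position[0], t["y"] - own_position[1])
        # Confidence discounted by distance; 100 m halves the weight.
        return t["confidence"] / (1.0 + dist / 100.0)
    return sorted(threats, key=score, reverse=True)
```

In use, the agent would rerun this ranking as the COP updates, so the Soldier's list always reflects the latest fused picture.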
Next-generation headsets are expected to have greater data collection capabilities for training scenarios, and to utilize AI. “That’s a case where a machine is going to be doing a lot of this work in terms of figuring out how you’re doing and helping you to improve your performance.
This could be anything from helping you with decision options, helping you with information, or helping you pull the right visualizations. The smarter the support stuff gets, the more benefit the user gets out of it.” While current technology can already perform some of these functions, what if this data were provided to individual dismounted soldiers in a matter of seconds? And instantly networked?
Operating in a matter of milliseconds, AI-empowered computer algorithms could bounce new information off vast databases of previously compiled data to make these distinctions, instantly informing soldiers caught in crossfire.
The Army calls this overall process “Soldier as a System”...the concept of using computer networking and the latest algorithms to seamlessly integrate otherwise disconnected nodes operated by soldiers. Specifically, this means a single electronic architecture will connect night vision goggles, individual weapon sights, wearable computers, and handheld devices showing moving digital maps and time-critical intelligence data.
Information from all of these otherwise separate soldier technologies, which can also include acoustic and optical sensors or mobile power sources, such as batteries, is naturally interdependent and interwoven in crucial battlefield circumstances.
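One common way to integrate such otherwise disconnected devices is a publish/subscribe bus, where each node announces what it knows without needing to know who is listening. The class below is a toy sketch of that pattern; it is an assumption about the architecture, not a description of the Army's actual implementation.

```python
from collections import defaultdict

class SoldierBus:
    """Toy publish/subscribe bus: each device (goggles, weapon sight,
    wearable computer, handheld map) subscribes to the topics it cares
    about, and any node can publish without knowing who is listening."""

    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, handler):
        """Register a callback to run whenever `topic` receives a message."""
        self._subs[topic].append(handler)

    def publish(self, topic, message):
        """Deliver `message` to every handler subscribed to `topic`."""
        for handler in self._subs[topic]:
            handler(message)
```

The decoupling is the point: adding a new sensor or display to the architecture means subscribing it to existing topics, not rewiring every other device.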
An ability to use various applications of autonomy and AI to create instant information-sharing in war, changes the tactical and strategic circumstances confronted by individual soldiers, massively improving prospects for survival.
“The use of autonomy will assist in assimilating data from these various systems and quickly provide useful options to command decision makers, including individual soldiers. Over time, more and more new intelligent technologies will be introduced and will continually change the nature of the battlefield and the very nature of the tasks the Soldiers perform.”
The headsets will be used to help equipment maintainers. “We can get to the point where I can look at the toolbox, the device itself can do artificial intelligence recognition at the edge, look into the toolbox and identify the tool that I need for the next step in the repair and highlight that tool for me. I pick that tool up, look at the airplane, and now the next step in the checklist for that repair is holographically displayed.”
One such device that technicians put through its paces is a robotic platform that soldiers or Marines can guide with hand and body movements to enter buildings, tunnels, or other areas, navigating and mapping in either a guided or autonomous mode.
Some of those efforts tackled problems that include detecting unmanned aircraft systems, accurately mapping areas and structures, maintaining communications with and awareness of friendly activities, protecting friendly forces from explosives or vehicles, and detecting indications of potentially hostile activity.
“In the future we will protect our soldiers by removing them from the effects of an initial undetected threat. Virtual presence, via remote autonomous systems, is a means that will allow us to realize this. The robot will be hit first.” The robot, which can move in response to troops’ hand and body movements, performs its mapping and navigation functions using LiDAR and other sensing.
The operator guided it using mixed reality, looking through a head-mounted augmented reality display that recognized the soldier’s hand gestures with the help of a “hand-pose detection glove”. The goal was to pair an autonomous ground robot with soldier-worn augmented reality to provide situational awareness.
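At its core, gesture control of this kind reduces to mapping recognized hand poses to robot commands. The gesture names and commands below are entirely hypothetical, chosen only to illustrate the pattern, including the safety choice of defaulting to "hold" for unrecognized poses.

```python
# Hypothetical gesture names and robot commands, for illustration only.
GESTURE_COMMANDS = {
    "fist":         "halt",
    "palm_forward": "advance",
    "point_left":   "turn_left",
    "point_right":  "turn_right",
}

def interpret(gesture):
    """Map a recognized hand pose to a robot command. Unknown poses fall
    back to "hold" rather than risking an unintended movement."""
    return GESTURE_COMMANDS.get(gesture, "hold")
```

The fail-safe default matters more than the table itself: a glove misread in a tunnel should freeze the robot, not send it somewhere unplanned.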
“Ultimately we will use this information to create capabilities that increase mission success while reducing risk to our military forces in urban operations.”
The program is aimed at taking the burden off soldiers by carrying water, ammunition, batteries and other heavy items needed to sustain a squad in remote environments.
But the platform can also be tailored to specific missions, such as running remote weapons stations, evacuating casualties, launching unmanned aerial systems, or conducting reconnaissance.
Soldier evaluation has been a critical part of the CRS-H program from the outset, and has helped compress the time it takes to field a modernized capability that meets the needs of the Soldier. "We develop equipment for Soldiers to use in demanding situations, and there is no substitute for their perspective in operating the system - their input is of utmost value."
1. Appealing to a Variety of Training Styles
2. Offering Experiences That Promote Repetition and Retention
3. Eliminating Risk and Safety Concerns
4. Reducing Training Budget and Providing Scalability
5. Delivering Results to a Wide Range of Industries
6. Removing Time and Travel from the Equation
7. Creating Scenarios That Are Otherwise Impossible
8. Focusing on a Practical Approach
9. Encouraging Trainees to Learn From Their Mistakes
10. Allowing for Self-Guided Exploration