VR may prove useful as a way to provide cheap and convenient training for maintenance tasks. Using off-the-shelf motion sensors, an HTC Vive headset and controllers with a trigger and sensing pad, the program takes a service member step-by-step through an aircraft repair job, from diagnosing the problem to re-testing the part after the fix to make sure it works.
The system is still a prototype, but the plan is to develop an automated guide so troops can train on key tasks with little oversight, wherever they are.
In the demonstration virtual reality maintenance scenario, a windshield washer pump needed to be replaced in a Navy P-8A Poseidon maritime patrol aircraft. Using the controllers, users could flip switches to test the pump, then perform the needed maintenance step-by-step in a 360-degree simulation of the aircraft.
“Certainly, you could do it with just about any aircraft, things that require troubleshooting.”
There are some things the system won’t do well. It can’t simulate resistance for more strenuous maintenance tasks, and a user can’t feel around out of view for a part the way a maintainer might in a hard-to-reach area. But in an era of high operational tempo, when senior maintainers might be deployed or otherwise unavailable to train more junior troops, engineers envision the system allowing troops to meet training goals and maintain proficiency wherever they are.
The system is designed to be lightweight and easily deployable. Troops can complete a virtual training session, then send a video of the session to a supervisor located anywhere in the world for approval or correction.
Development of the system is still in the early stages, but the system has so far received a warm reception at demonstrations for Air Force and Marine Corps audiences.
The Army is gearing up to launch the first iterations of its new virtual reality simulators, which will lay the foundation for synthetic training environments at multiple bases.
A squad advanced marksmanship trainer will be delivered to several Army locations next year for close-combat troops. A squad immersive virtual trainer will closely follow.
The building blocks that will become the synthetic training environment, or STE, will eventually include computer-generated avatars incorporated into the battlespace, among other virtual military elements.
The surroundings the trainers simulate will represent real environments around the globe, from "mega-cities" to dense urban areas.
The service is collecting data to reconstruct cities, mountainsides, bunkers and more to more accurately represent what soldiers will see in the virtual-reality environment. Officials said that poses a challenge, but service members must get an accurate representation of what they may face in combat.
Soldiers will be exposed to more realistic combat scenarios, "enabling units to enter live training at a much higher level of proficiency." The goal is to rely less on bulky hardware for simulations and more on software and networks, including virtual reality goggles and iPads for streaming services.
While the Army is looking for more personalised training, the new, simulated environments are intended to boost the collective squad, which would face a high-end threat together.
The Army is looking at training at the collective level: a squad, a crew, a team, a platoon and then on up. But the individual piece has to be right in order to build to that.
The cross-functional team leading the service's unusually swift acquisition effort and collaboration with industry had been asked to be disruptive, and the Army believes it has done just that.
Training is changing as the Army pursues dynamic live, virtual, and mixed-reality training that offers data analysis supported by artificial intelligence and other smart systems. Being able to take training data and run it through trend and predictive analysis is expected to be a game changer.
Let's say there's a four-man team preparing to clear a building in a training exercise. As the first man busts through the door, a biometric feedback sensor indicates that his adrenaline spiked off the charts while muzzle and eye tracking sensors showed the soldier looking one way while his gun pointed another. When the third man enters, a motion sensor indicates that he froze momentarily.
And all this data is being run through machine learning systems for trend and predictive analysis, producing a readiness score for essential tasks. Imagine soldiers training to fight augmented reality adversaries in virtual battle spaces, showdowns that like video games can take place in cities around the world.
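The pipeline described above can be sketched in a few lines. This is a toy illustration only: the sensor event names, penalty weights, and scoring rule are invented for the example and do not reflect any fielded Army system.

```python
from statistics import mean

def readiness_score(events):
    """Combine flagged sensor events into a 0-100 readiness score
    using a simple weighted-penalty rule (hypothetical weights)."""
    penalties = {
        "adrenaline_spike": 15,      # biometric sensor: adrenaline spiked
        "gaze_muzzle_mismatch": 25,  # eye tracker disagrees with muzzle tracker
        "hesitation": 20,            # motion sensor: soldier froze momentarily
    }
    score = 100
    for event in events:
        score -= penalties.get(event, 0)
    return max(score, 0)

def trend(scores, window=3):
    """Moving average over recent drills -- a stand-in for trend analysis."""
    return mean(scores[-window:])

# First man in: adrenaline spiked and gaze diverged from the muzzle.
first_man = readiness_score(["adrenaline_spike", "gaze_muzzle_mismatch"])
# Third man in: froze momentarily at the door.
third_man = readiness_score(["hesitation"])
print(first_man, third_man)        # 60 80
print(trend([55, 60, first_man, third_man]))
```

A real system would replace the hand-tuned penalties with a machine learning model trained on historical drill data, but the flow is the same: sensor events in, per-task readiness score and trend out.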
"We have these abilities, and have seen it from our industry partners. Instantaneous feedback." While the Army is not there yet, the service is quickly moving in that direction.
Soldier lethality is one of the priorities of the newly established Army Futures Command, a four-star command focused on rapid research and development for future weapons and warfighting capabilities, as well as enhanced training options.
"There are systems that we're looking at that can allow the soldiers to train as they will fight, train where they will fight and train against who they will fight while back in the home-station training environment."
One option for the Army is next-level synthetic training environments, where troops can train individually or in groups in fixed or mobile live, virtual, or mixed-reality battle spaces of all sizes.
This is a big deal given the inadequacies of some of the existing training platforms. The current training systems are limited in their capabilities. For example, the technology for the existing virtual trainers does not allow the Army to bring in all of the enablers, such as logistics, engineering, and transportation teams.
“We can only bring air, ground platforms, and a few other capabilities. We need to train combined arms to prepare for large-scale combat.”
Terrain is also a huge challenge. "We are trying to get to one-world training," the general said. "Terrain is a stumbling block. We are trying to get after that quickly."
User assessment testing for re-configurable virtual trainers began earlier this year. Within the next two years, the Army wants AI-driven trend and predictive analysis based on biometric and sensor data collected during training exercises.
"Right now, we are only as good as someone's experience and their eye and what they catch or what we see in video. We want to be able to assess training, and we have some of that capability right now, but not to the degree we need."
For much of the U.S. military’s history, live training has been key to preparing personnel for their missions. However, staging a live training event can consume significant physical and fiscal resources, from aircraft, ground equipment and ships to all the personnel involved. Plus, the risk of accidents resulting in damage to equipment, or worse yet, endangering personnel, can increase.
That’s why the military started utilising virtual training to provide many of the same positive benefits while minimising the negative impacts of live training. These benefits, including personnel safety, readiness improvement and cost reduction, have led the military to take training a step further and utilise live, virtual and constructive, or LVC, training that brings together multiple systems using networking and even cloud capabilities.
LVC training allows personnel not physically present at a live training event to participate virtually and through constructive simulations that inject battlefield effects and simulated or constructed threats into live systems.
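In data terms, that means presenting live assets, remote virtual participants, and constructive threats as one shared picture. The sketch below uses a toy record layout invented for illustration; real LVC interoperability relies on standards such as DIS or HLA, not this schema.

```python
from dataclasses import dataclass

@dataclass
class Track:
    """One entity in the shared training picture (toy schema, not DIS/HLA)."""
    entity_id: str
    source: str      # "live", "virtual", or "constructive"
    kind: str        # e.g. "aircraft", "sam_site"
    position: tuple  # (lat, lon), illustrative only

def build_picture(live, virtual, constructive):
    """Merge live assets, virtual participants, and constructive threats
    into one common picture, keyed by entity ID."""
    picture = {}
    for track in [*live, *virtual, *constructive]:
        picture[track.entity_id] = track  # later sources overwrite duplicates
    return picture

live = [Track("F16-01", "live", "aircraft", (36.2, -115.0))]
virtual = [Track("SIM-07", "virtual", "aircraft", (36.3, -114.9))]
constructive = [Track("THREAT-1", "constructive", "sam_site", (36.5, -114.7))]

picture = build_picture(live, virtual, constructive)
print(sorted(picture))  # ['F16-01', 'SIM-07', 'THREAT-1']
```

The point of the sketch is the merge itself: to a trainee in the live aircraft, the simulator pilot and the injected threat appear in the same battlespace as real participants.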
A recent example of LVC training is the Air Force’s investment in a common software architecture for its training simulators, creating the Simulator Common Architecture Requirements and Standards program. Also, the Navy, Marine Corps and Air Force are all looking to connect simulators and live assets to enhance air warfare training.
As LVC technology advances, commercial off-the-shelf technologies play an increasingly critical role. By leveraging the advances in commercially available IT, DoD can gain significant advantages, including reduced development and deployment times as well as the ability to reuse capabilities to gain significant efficiencies. Advanced server technologies and cloud capabilities can maximise reusability and rapid reconfiguration of infrastructure for numerous training needs.
As the military continues to explore the use of LVC training and simulation, and blends real equipment and personnel with virtual assets, commercial off-the-shelf IT capabilities will enable high fidelity, speed and immersive training experiences to grow skills and develop proficiency for our military forces.
That network strategy should do two things. Identify and mitigate risks quickly: To keep up with evolving threats, an intent-based network can serve as both a sensor and an enforcer of security policy, leveraging artificial intelligence and machine learning to move at machine speed and counter advanced threats. Software-defined networks can also provide the ability to rapidly reconfigure given changes in real-world conditions or across various training scenarios.
Reduce the attack surface: Zero trust or white list segmentation can greatly reduce a cyber adversary’s maneuverability within an operational space in the event of an attack. Maintain an accurate and timely view of the threat landscape, segment access based on roles for devices, people and applications, and utilise security policies that are software-driven to support rapid changes based on threats and real-world environments.
By combining LVC with the right network strategy, DoD can securely achieve significant cost and efficiency benefits, reduce wear and tear on operational systems, and decrease the chance of mishaps that can occur in traditional live training. Building LVC capabilities on a sound network architecture minimises risk and ensures the mission is accomplished.