Mixed-reality systems can bring soldier feedback into development earlier than ever before. Here’s how the U.S. Army is using them.

ABERDEEN PROVING GROUND, Md. — The U.S. Army’s Combat Capabilities Development Command has made clear it wants to introduce soldier feedback earlier in the design process, ensuring that new technologies are meeting users' needs.

“Within the CCDC, the need to get soldier feedback, to make sure that we’re building the appropriate technologies and actually getting after the users' needs is critical,” said Richard Nabors, acting principal deputy for systems and modeling at the command’s C5ISR Center (Command, Control, Computers, Communications, Cyber, Intelligence, Surveillance and Reconnaissance).

During Project Convergence 2020, U.S. Army Futures Command was able to test out AI-enabled systems like Dead Center and the Maven Smart System and Algorithmic Inference Platform to process sensor data on board and automatically detect potential threats. (U.S. Army photo by Spc. Jovian Siders, 92nd Combat Camera Company)

“There’s a concerted effort within the C5ISR Center to do more prototyping not just at the final system level ... but to do it at the component level before the system of systems is put together,” he added.

But how can the service accomplish that with systems still in development?

One answer: virtual reality.

The Army’s CCDC is testing this approach with its new artificial intelligence-powered tank concept: the Advanced Targeting and Lethality Aided System, or ATLAS. While tank operations today are almost entirely manual, ATLAS aims to automate the threat detection and targeting components of a gunner’s job, greatly increasing the speed of end-to-end engagements.

Using machine-learning algorithms and a mounted infrared sensor, ATLAS automatically detects threats and sends targeting solutions to a touch-screen display operated by the gunner. When the gunner touches an image of a target, ATLAS automatically slews the tank’s gun to the threat and recommends the appropriate ammunition and response type. If everything appears correct, the gunner can simply pull the trigger to fire at the threat.

The process takes just seconds, and the gunner can immediately move on to the next threat by touching the next target on the display.
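
To make that flow concrete, here is a minimal Python sketch of the engagement loop described above. It is an illustration only: the data structures, function names and ammunition policy are assumptions made for the example, not the Army’s actual ATLAS software.

    # Hypothetical sketch of the detect -> touch -> slew -> recommend -> fire flow.
    # Every name and rule here is illustrative, not the fielded system.
    from dataclasses import dataclass

    @dataclass
    class Threat:
        track_id: int
        bearing_deg: float        # where the turret would slew
        classification: str       # e.g. "armor", "personnel"

    def recommend_ammo(threat):
        # Placeholder policy standing in for the system's ammunition recommendation.
        return {"armor": "sabot", "personnel": "high-explosive"}.get(
            threat.classification, "multi-purpose")

    def slew_gun(bearing_deg):
        print(f"slewing gun to {bearing_deg:.1f} degrees")

    def engage(threats, gunner_confirms):
        # Detected threats appear on the touch screen; touching one slews the gun and
        # queues a recommended round, but the gunner still pulls the trigger.
        for threat in threats:
            slew_gun(threat.bearing_deg)
            ammo = recommend_ammo(threat)
            if gunner_confirms(threat, ammo):
                print(f"firing {ammo} at track {threat.track_id}")

    engage([Threat(1, 42.0, "armor"), Threat(2, 118.5, "personnel")],
           gunner_confirms=lambda threat, ammo: True)   # stand-in for the trigger pull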

ATLAS could revolutionize the way tank crews operate — at least in theory. But to understand how the system works with real people involved and whether this is a tool gunners want, CCDC needed to test it with soldiers.

The Army has set up an ATLAS prototype at Aberdeen Proving Ground in Maryland, and it hopes to conduct a live-fire exercise soon with targets in a field. However, to collect useful feedback, CCDC is giving soldiers a more robust experience with the system that involves multiple engagements and varying levels of data quality. To do this, the command has built a mixed-reality environment.

“It gives us the opportunity … to get the soldiers in front of this system prior to it being here as a soldier touchpoint or using the live system so we get that initial feedback to provide back to the program, to get that soldier-centric design, to get their opinions on the system, be that from how the GUI is designed to some of the ways that the system would operate,” explained Christopher May, deputy director of the C5ISR Center’s Modeling and Simulation Division.

The virtual world

In the new virtual prototyping environment (itself a prototype), users are placed in a 3D world that mimics the gunner station while handling a physical controller and display that are carbon copies of the current ATLAS design. The CCDC team can then feed simulated battlefield data into the system for soldiers to respond to as if they were actually using ATLAS.

As with most virtual reality systems, the outside looks less impressive than the rendered universe inside. When the user sits down in the gunner’s seat, a trifold of tall blue walls envelops their vision, cutting them off from the real world. Directly in front of the chair is a recreation of ATLAS’ touch-screen display and a 3D-printed copy of the controller.

Putting on the virtual reality headset, the user is immersed in a 3D rendering of the ATLAS prototype’s gunner station, but with some real-world elements.

“We’re leveraging multiple technologies to put this together. So as the operator looks around … he has the ability to see the hand grips. He also has the ability to see his own hands,” May said.

All in all, the mixed-reality environment creates the distinct impression that the user is in the gunner’s chair during a real-life engagement. And that’s the whole point.

It’s important to note the virtual reality system is not meant to test the quality of the AI system. While the system populates the virtual battlefield with targets the same way ATLAS would, it doesn’t use the targeting algorithm.

“We’re not using the actual algorithm,” May said. “We’re controlling how the algorithm performs.”
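
One plausible way to read that, sketched below under stated assumptions: rather than running a real aided-target-recognition model, the simulation can generate “detections” directly from the scenario’s ground truth, with dials for missed detections, false alarms and position error. Everything in the sketch is hypothetical and is not the C5ISR Center’s code.

    # Surrogate detector: turns ground-truth scenario targets into detections of a
    # chosen quality, so the team controls how the "algorithm" appears to perform.
    import random

    def surrogate_detections(ground_truth, p_detect=1.0, false_alarms=0, pos_error_m=0.0):
        detections = []
        for target in ground_truth:
            if random.random() <= p_detect:                 # dial for missed detections
                detections.append({
                    "x": target["x"] + random.uniform(-pos_error_m, pos_error_m),
                    "y": target["y"] + random.uniform(-pos_error_m, pos_error_m),
                    "label": target["label"],
                })
        for _ in range(false_alarms):                       # dial for spurious detections
            detections.append({"x": random.uniform(0, 1000),
                               "y": random.uniform(0, 1000),
                               "label": "unknown"})
        return detections

    # "Perfect" algorithm behavior vs. a degraded run for the same scenario:
    scene = [{"x": 250.0, "y": 400.0, "label": "armor"}]
    print(surrogate_detections(scene))                                          # ideal data
    print(surrogate_detections(scene, p_detect=0.7, false_alarms=2, pos_error_m=15.0))  # degraded data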

Switching up the scenarios

Another advantage to the mixed-reality environment: The Army can experiment with ATLAS in different vehicles. CCDC leaders were clear that ATLAS is meant to be a vehicle-agnostic platform. If the Army decides it wants ATLAS installed on a combat vehicle rather than a tank — like the current prototype — the CCDC team could recreate that vehicle within the simulated environment, giving users the opportunity to see how ATLAS would look on that platform.

“We can switch that out. That’s a 3D representation,” May said. “This could obviously be an existing tactical vehicle or a future tactical vehicle as part of the virtual prototype.”

Cadet 1st Class Cade Cavanagh uses a virtual reality system to practice flying skills during a Pilot Training Next course in Summer 2019 at the Air Force Academy airfield. Capella Space's Synthetic Aperture Radar images could be used in future virtual reality scenarios. (Jennifer Spradlin/Air Force Academy)

But is the virtual reality component really necessary to the experience? After all, the interactions with the ATLAS surrogate take place entirely through the touch screen and the controller, and a soldier could get an idea of how the system works without ever putting on the headset.

May said that, according to feedback he’s received, the virtual reality component adds that extra level of realism for the soldier.

“They thought it added to their experience,” May said. “We’ve run through a version of this without the mixed reality — so they’re just using the touch screens and the grips — and they thought the mixed reality added that realism to really get them immersed into the experience.”

“We’ve had over [40 soldiers] leveraging the system that we have here to provide those early insights and then also to give us some quantitative data on how the soldier is performing,” he added. “So we’re looking from a user evaluation perspective: Again, how does the [aided target recognition] system influence the soldier both positively, potentially and negatively? And then what is the qualitative user feedback just of the system itself?”

In other words, the team is assessing how soldiers react to the simulated battlefield they are being fed through the mixed reality system. Not only is the team observing how soldiers operate when the data is perfect; it also wants to see how soldiers are impacted when fed less accurate data.
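
As a rough illustration of the kind of quantitative record such an evaluation might keep (the metric and field names here are assumptions, not the team’s actual instrumentation), a per-engagement timing log could look like this:

    # Minimal per-engagement timing log, summarized per data-quality condition.
    # Purely illustrative of the measurement idea described above.
    import statistics, time

    class EngagementLog:
        def __init__(self, condition):
            self.condition = condition      # e.g. "perfect data" or "degraded data"
            self.times = []

        def record(self, presented_at, fired_at):
            self.times.append(fired_at - presented_at)

        def summary(self):
            return {"condition": self.condition,
                    "engagements": len(self.times),
                    "mean_time_s": round(statistics.mean(self.times), 2)}

    log = EngagementLog("degraded data")
    t0 = time.monotonic()
    log.record(t0, t0 + 4.8)        # soldier took 4.8 s on this simulated target
    log.record(t0 + 10, t0 + 13.1)
    print(log.summary())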

Soldiers are also interviewed after using the system to get a sense of their general impressions. May said users are asked questions such as “How do you see this impacting the way that you currently do your operations?” or “What changes would you make based off your use of it?”

The virtual prototyping environment is an outgrowth of CCDC’s desire to push soldier interactions earlier in the development process, and it could eventually be used for other systems in development.

“We’re hoping that this is kind of an initial proof of concept that other programs can kind of leverage to enhance their programs as well,” May said.

“This is a little bit of a pilot, but I think we can expect that across the C5ISR Center and other activities to spend and work a lot more in this virtual environment,” added Nabors. “It’s a great mechanism for getting soldier feedback [and] provides us an opportunity to insert new capabilities where possible.”