Soldiers love simulators. It doesn’t take any particularly sophisticated methods of data capture or monitoring to make that clear, as Matt Lewis knows well. While researching the effectiveness of different types of simulation-based collective skills training in the US Army, he upset quite a few tank crews. “I’ve had soldiers tell me, ‘Do not take the Close Combat Tactical Trainer (CCTT) away,’” he laughs. “I mean – literally threatening me.”

It’s not that Lewis doesn’t think synthetic environments are important tools for training complex skills. “I’m a technology guy,” he clarifies. “I call myself a technophile. But I’m also a recovering technophile, as I’ve learned it’s not the hammer for all solutions. You need the right technology for the right application.”

In fact, Lewis’s main problem with the roughly 25-year-old CCTT – a distributed-processing system consisting of computer-driven, manned module vehicle simulators – isn’t necessarily about the technology at all. Each of the seven CCTT fixed facilities’ 30-plus military vehicle simulators makes it possible for crews to practice the complex skills required to act and manoeuvre as a unit in a range of different scenarios. “But this is the shocking part: it’s really underused,” he reveals. “You’ve got this incredible training environment that allows you to get in a tank and drive it around, so why would it be really underused?”

As a senior research scientist at the Pardee RAND Graduate School, Lewis has to suppress his most technophilic tendencies and evaluate simulated training environments from the perspective of a cognitive psychologist. As a technology that can be inserted into those environments to record users’ eye movements and produce data that helps researchers and trainers understand their thought processes, eye tracking should win his approval twice over. Such monitoring can accurately identify brain states such as fatigue or high concentration, and there’s no better way to evaluate ‘latent’ situational awareness and information acquisition activity for pilots, drone operators or soldiers on patrol.

“Drone operators often say it’s really hard to pick out good guys from bad guys from fuzzy camera feeds,” Lewis explains. “So if I can put good guys and bad guys on screen in ways that can help train you to recognise the differences between them, having eye-tracking data is going to be really important.” Indeed, by tracking how a range of operators complete such tasks across a number of different scenarios, the data could be used to track differences in their visual fixations, saccades and smooth pursuit movements, as well as their raw target acquisition abilities. As such, training programmes could get a better insight not just into who their best students are, but also why they succeed.
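To give a sense of how gaze data becomes those metrics, the sketch below shows a minimal velocity-threshold classifier, a standard technique in eye-tracking research for separating fixations from saccades. It is purely illustrative and not drawn from Lewis’s work; the sample rate and velocity threshold are hypothetical values.

```python
# Illustrative sketch: a minimal velocity-threshold (I-VT) classifier.
# Sample rate and threshold are hypothetical; real systems tune both.
from dataclasses import dataclass
from math import hypot

SAMPLE_RATE_HZ = 250           # hypothetical tracker sample rate
SACCADE_THRESHOLD_DPS = 100.0  # velocity cut-off, degrees per second


@dataclass
class GazeSample:
    x_deg: float  # horizontal gaze angle, degrees
    y_deg: float  # vertical gaze angle, degrees


def classify(samples: list[GazeSample]) -> list[str]:
    """Label each inter-sample interval 'fixation' or 'saccade' by gaze velocity."""
    dt = 1.0 / SAMPLE_RATE_HZ
    labels = []
    for prev, curr in zip(samples, samples[1:]):
        velocity = hypot(curr.x_deg - prev.x_deg, curr.y_deg - prev.y_deg) / dt
        labels.append("saccade" if velocity > SACCADE_THRESHOLD_DPS else "fixation")
    return labels


# Steady gaze followed by a rapid five-degree shift:
trace = [GazeSample(0.0, 0.0), GazeSample(0.01, 0.0), GazeSample(5.0, 0.0)]
print(classify(trace))  # ['fixation', 'saccade']
```

Counting and timing the resulting fixation and saccade labels per scenario is what would let trainers compare operators in the way Lewis describes.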

It’s a long way from the CCTT, but Lewis’s favourite use of eye-tracking technology actually helped to solve a previously intractable mystery. A Belgian study into children’s ability to solve word problems like “Johnny has four apples and Sally has 15 apples, so how many more apples does Sally have than Johnny?” showed that the children who struggled did so not because of their problem-solving abilities, but because of how they looked at the questions. “The eye tracking very clearly demonstrated that they get to the first number, then they read to the second number, and then they stop reading,” he explains. “The data shows the brutal quantitative reality that they never read the end of that sentence. They never saw the problem they had to solve.”

Data limitations

The Interservice/Industry Training, Simulation and Education Conference (I/ITSEC) in Orlando, US, is the largest event of its kind in the world. Lewis has spent a lot of time in its exhibition hall, which is “huge, loud and very high-tech”. The world’s leading hardware providers bring their best immersive environments – and the glowing testimonials to match, but most lack quantitative evidence of their effectiveness. The last time Lewis was there, the only simulator that had the data he was seeking was for welding.

That said, a vehicle simulator without these capabilities seems like a good deal when you can’t always go and drive 14 tanks through a suburban area to put soldiers through their paces. “That’s when people are willing to throw incredible amounts of money around and not be demanding consumers,” Lewis sighs. “They’re not thinking whether it’s an effective, and especially a cost-effective, training environment.” He urges military buyers to step back and think more scientifically about the problems they need to solve. “Look for quantitative data that says how much users are going to learn, because at some point it’s going to come down to dollars per training hour per soldier, and each one can end up costing a huge amount of money.”
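The metric Lewis suggests is simple arithmetic, as the back-of-the-envelope sketch below shows. Every figure here is invented purely for illustration; it is not real CCTT or procurement data.

```python
# Hypothetical illustration of the "dollars per training hour per soldier"
# metric Lewis describes. All figures are invented for the sake of the example.
def cost_per_training_hour(total_cost: float, hours: float, soldiers_per_session: int) -> float:
    """Lifetime cost divided by total soldier-hours of training delivered."""
    return total_cost / (hours * soldiers_per_session)


# A large fixed simulator facility vs a cheap laptop-based environment:
simulator = cost_per_training_hour(total_cost=200_000_000, hours=50_000, soldiers_per_session=4)
laptops = cost_per_training_hour(total_cost=500_000, hours=200_000, soldiers_per_session=1)

print(f"simulator: ${simulator:,.0f}/hr/soldier")  # simulator: $1,000/hr/soldier
print(f"laptops:   ${laptops:,.2f}/hr/soldier")    # laptops:   $2.50/hr/soldier
```

The point of the comparison is the one Lewis makes: a system that is cheaper per hour but used far more often can dominate the metric, regardless of which has the more impressive hardware.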

Despite the hundreds of millions of dollars the CCTT has cost the US Department of Defense over its lifetime, and despite the dedicated civilian staff that works to keep it running, its scope is limited and access is difficult. “You have to sign up for it, and then you have to get all your people there,” Lewis explains, “and then the equipment may not even match what you have at your motor pool.”

On one telling occasion, Lewis was in the control room at a CCTT site when the simulation crashed. Spotting the opportunity to canvass more soldiers, he went and knocked on the door of one of the 10ft cubes that house the individual vehicle models. All those inside thought the CCTT gave them a valuable training opportunity, even with its proclivity for unexpected stoppages. Then it turned out that two of the four also played online tank games in their spare time. “They said that they could learn some of the same kind of things from games on their laptops,” recalls Lewis. “They’ve already found these other training environments, and they’re out driving tanks and learning things like ‘don’t silhouette yourself on a hill if you don’t want to get killed’.”

Lewis isn’t saying World of Tanks is superior to the CCTT as a technology for training, but rather that training technologies with lower levels of physical fidelity can still be effective if they offer high psychological fidelity, particularly if they’re easier to use and allow for more repetitions.

“You cannot start with the technology,” he stresses. “The technology is the tool you use to get the fidelity you need. That’s the opposite of how almost everyone approaches this, but you need to start with the end in mind and ask what it is you need to teach. From there you can ask if you need realistic wind, or to be jiggled around, or to smell burning trash on the streets of Mogadishu. And the answer, probably, is no.”

[Stat callout: the approximate horizontal amplitude for human binocular vision – Journal of Rehabilitation and Assistive Technologies Engineering]

Eye to eye

To employ it effectively, those interested in using eye-tracking technology need to take a lesson from eye-tracking data and make sure they look closely at the training problems they need to solve. At present, lower-fidelity technologies offer some of the same functionalities. A video screen could be combined with either eye-scanning cameras or mouse inputs to assess a soldier’s ability to identify possible threats while on patrol, for instance. Whereas the cameras would do so by tracking eye movements, the soldier could use the mouse to click on issues as they arise. The latter is far cheaper and easier to repeat, but introduces an artificial step that could potentially interfere with the skills that trainers are looking to assess and enhance.
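The trade-off between the two set-ups can be made concrete with a short sketch. With mouse input, each threat yields a single explicit click event; with gaze data, detection has to be inferred, for example as the first fixation falling inside the threat’s region of interest. The function, region and timings below are all hypothetical, invented for illustration.

```python
# Illustrative sketch of inferring threat detection from gaze data, versus
# the explicit event a mouse click provides. All names and numbers are hypothetical.
def first_fixation_time(gaze, roi):
    """Return the timestamp of the first gaze sample inside the region of interest.

    gaze: list of (t_seconds, x_px, y_px) samples.
    roi:  (xmin, ymin, xmax, ymax) bounding box of the on-screen threat.
    """
    xmin, ymin, xmax, ymax = roi
    for t, x, y in gaze:
        if xmin <= x <= xmax and ymin <= y <= ymax:
            return t
    return None  # the soldier never looked at the threat


threat_roi = (100, 100, 200, 200)
gaze_trace = [(0.0, 50, 50), (0.3, 120, 150), (0.6, 130, 160)]
click_time = 0.9  # hypothetical: the explicit click comes later than the first look

print(first_fixation_time(gaze_trace, threat_roi), click_time)  # 0.3 0.9
```

The gap between the two timestamps is exactly the kind of signal the artificial clicking step can obscure: the soldier saw the threat well before they acted on it.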

[Stat callout: the precision required for an eye-tracker to record detailed fixations and accurately measure saccades – Journal of Rehabilitation and Assistive Technologies Engineering]

“So,” Lewis asks, “what’s the benefit in trying not to interfere with the process versus having people do this artificial task?” Well, while a mouse can be used the moment it’s grasped, a recent survey of modern eye-tracking devices found calibration times can vary from as low as two seconds to as long as nine minutes. Moreover, blinking doesn’t affect the accuracy of hand inputs, and whereas a mouse is endlessly interoperable, eye-scanners need to be programmed to work with specific training simulations and environments. Still, Lewis argues that “under the right conditions, looking at and analysing eye-tracking data can give a richer trace of the underlying cognitive process”.

Lewis compares this to eliciting ‘verbal protocols’ during tasks like reading and solving physics problems. By asking capable readers and problem-solvers to verbalise their thought processes as they happen, researchers have identified the metacognitive skills that make people good learners. Importantly, those skills have less to do with the information people are given than the way they elaborate and build cognitive frameworks around it. There’s already evidence of something similar happening with eye movements, as numerous studies have shown that they are a highly reliable indicator of a pilot’s expertise. Research from SUNY Downstate Medical Centre has provided compelling evidence that when novice pilots watch the eye movements of expert pilots via eye-tracking technology, they rapidly improve their visual scanning strategies in the cockpit, even without explicit instructions. By making the ocular protocols of expert pilots visible, it seems eye-tracking technology might be the best way to move from making trainees memorise what to do to helping them understand how to act. That isn’t possible with a mouse.

Nevertheless, there is currently a lack of guidelines and criteria for using and interpreting eye-movement measures. Even if there weren’t, a standardised way of assessing eye-tracking data might fail to account for the highly idiosyncratic way each person looks at their environment, particularly when they’re good at processing it. Militaries also need to ensure they find the right eye-tracking equipment to maximise training time, because the cognitive frameworks developed by good learners are bolstered and secured by repetitions across different scenarios.

That works for both the users and the trainers. “At some point the question is how much data do you need, and for what do you need that data,” says Lewis. “For data people, and I’m one of them, more data is better. Give me more.”