I ask RAF Wing Commander Keith Dear what the future battlefield might look like as drone development accelerates. He describes how techniques based on animal mimicry are driving rapid and diverse development of exciting new unmanned vehicle (UV) designs, and how new models suited to many divergent tasks are already beginning to operate in the skies above the world’s geopolitical hotspots. Scientists and engineers working for the Defense Advanced Research Projects Agency in the US recently built a nano-hummingbird surveillance drone. There are rumours of a mosquito-mimicking design built to illicitly collect DNA samples from unsuspecting subjects – Dear notes that this is likely science fiction for now, but that such an aircraft is conceptually plausible. Researchers are even envisioning swarms of drones working together in combat. The aerial battlefield of the future will soon resemble an incredible dystopian zoo – at least in the view of Dear, an intelligence professional whose expertise lies in intelligence, surveillance and reconnaissance (ISR) analysis of systems and human behaviour.

Dear founded a not-for-profit start-up dedicated to delivering humanitarian aid by cargo drone (originally focused on Syria’s sieges), and is a long-time professional observer of drone development. He sees today’s growing capabilities across a range of niches eventually leading to greater diversification among the drone designs of the future.

“Much as with manned aviation, unmanned aircraft will need to be tailored to task, threat and budget,” he explains. “In my view, unmanned aircraft will need to be specialised to out-compete those of our adversaries, and I don’t see a single adaptable platform being the basis for that. Just as with manned aviation, they will need to be designed for the altitude, environment and threat level they need to operate at, to have the speed, manoeuvrability, lift, survivability and stealth required for their given mission.”

In his opinion, there is a looming difference between the limited advantages UV development would give modern militaries in a theoretical battle against a peer or near-peer competitor, and the more complete dominance it could give them against non-state actors in the chaos of urban guerrilla warfare – by, in essence, creating a totalising tracking system monitored by ‘smart’ machines. But many of the same surveillance and intelligence systems being developed for unmanned aerial vehicles (UAVs) come with ethically troubling implications for a variety of institutions involved in security issues.

Looking ahead, I would suggest we’ll see more and more incorporation of behavioural identification algorithms and psychographic profiling based on visual observation.

A drone of contention

Dear sees UVs combining advancing sensor technologies in one package with autonomous smart AI programmes, mobility and a lethal strike capacity. “It’s the matching of [UAV] sensors to [human] psychology that really interests me,” he says. “Perhaps I wouldn’t say that it excites me – I worry about the downsides of these developments too – but it has huge implications for intelligence staff, law enforcement, and policymakers. It’s going to give us insight that goes beyond anything we’ve had before, and with that are going to come an awful lot of difficult ethical dilemmas for us all to wrestle with.”

He adds that machine learning is going to play an ever-greater role in conducting first-line analysis in counterinsurgency and law enforcement operations, which are frequently blending together in an era of criminalised insurgent groups. There are already machine-learning algorithms capable of spotting ‘covert convoys’, that is, seemingly unrelated vehicles in traffic that are in fact coordinating their movements.
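The article does not describe how such convoy-spotting algorithms work, but the core intuition – seemingly unrelated vehicles whose movements are in fact coordinated – can be caricatured in a few lines. In this purely illustrative sketch (the function names and the variance threshold are invented for the example, and bear no relation to any real ISR system), a pair of vehicles is flagged when the distance between them stays suspiciously stable over time:

```python
import math

def headway_variance(track_a, track_b):
    """Variance of the distance between two vehicles over time.
    Each track is a list of (x, y) positions sampled at the same instants."""
    dists = [math.dist(p, q) for p, q in zip(track_a, track_b)]
    mean = sum(dists) / len(dists)
    return sum((d - mean) ** 2 for d in dists) / len(dists)

def flag_covert_pairs(tracks, threshold=4.0):
    """Flag vehicle pairs whose separation is unusually stable.
    `tracks` maps a vehicle id to its position track."""
    flagged = []
    ids = sorted(tracks)
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            if headway_variance(tracks[a], tracks[b]) < threshold:
                flagged.append((a, b))
    return flagged
```

Real systems would of course have to contend with traffic lights, occlusion and far noisier tracks; the sketch only shows why coordinated movement is statistically detectable at all.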

Dear anticipates that in the urban war zones of the 21st century, present and future, we will soon see spy drones spotting these kinds of patterns and responding in real time – zooming in for a closer look at the drivers, for example, or gaining altitude and detailing other UAVs to covertly track every vehicle to its destination.

“Looking ahead, I would suggest we’ll see more and more incorporation of behavioural identification algorithms and psychographic profiling based on visual observation,” Dear predicts. “This includes UV sensors spotting behavioural tells in individuals and groups to enable preventative intervention. At this year’s surveillance conference in Shenzhen, China, drones were showcased that could be used to hover above crowds and detect individuals becoming violent from behavioural signatures such as aggressively raised arms.

“These sensors are here now and will become more sophisticated – perhaps adding in changing body temperature and heart-rate, gait, and facial expression. There are more and more studies relating our movements and disposition to personality profiles, to enable prediction of behaviours and actions. I think UAVs will play a role in building these pictures and responding to them.”
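As a purely illustrative sketch of the kind of behavioural signature Dear describes – with the keypoint format, names and thresholds all invented for this example – a ‘raised arms’ heuristic over pose-estimation output might look like this:

```python
def arms_raised(keypoints):
    """Crude behavioural-signature check: both wrists above the shoulders.
    Image y-coordinates grow downward, so 'above' means a smaller y value.
    `keypoints` maps a joint name to an (x, y) pixel position."""
    return (keypoints["left_wrist"][1] < keypoints["left_shoulder"][1]
            and keypoints["right_wrist"][1] < keypoints["right_shoulder"][1])

def flag_agitated_crowd(crowd_poses, min_fraction=0.2):
    """Flag a crowd when at least `min_fraction` of people show the signature."""
    raised = sum(arms_raised(pose) for pose in crowd_poses)
    return raised / len(crowd_poses) >= min_fraction
```

A deployed system would feed such rules from a pose-estimation model running on drone video, and would combine many weak signals rather than one; the sketch only shows the shape of the logic.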

But on the battlefield, at least in cases of peer-to-peer confrontations, Dear cites the work of military historian Azar Gat to show that arms races between states have normally ended in stalemate, rather than led to a decisive breakthrough for one side or the other. Consequently, he doubts whether drone surveillance abilities will create a transparent battlespace in a peer-to-peer fight. Instead, Dear believes that advanced adversaries will find new ways to obscure their movements from one another, much as they do today. He predicts that because camouflage and detection will advance in lockstep, with defence and attack racing to achieve a decisive edge, neither will truly gain any long-term advantage over the other.

However, accelerating UV sensing capabilities are creating a new problem for military planners: a battlespace data deluge the likes of which armed forces have never known before. That could spark a shift towards the use of machine learning to cope with the staggering amount of real-time front-line intelligence that can now be gathered. “We’re seeing that in the US with Project MAVEN, with AI being used in image recognition,” Dear says. “More speculatively, in my personal view, edge computing will see much of this done on the platform (to save bandwidth and speed decision-making), with an aircraft able to retask itself to follow up algorithmically identified leads. This delegation will be necessary if the number of aircraft and sensors increases at the rate I would anticipate – making it impossible to watch all the feeds all the time.”

Removing humans from the decision-making loop is another controversial step that Dear sees coming in response to the drastic increase in information drones can now deliver from their sensors – an increase that will only continue to grow. For now, it is clear that in any case involving the taking of life or the use of force, many militaries still intend to have a human making the final decision on whether to fire or not. But Dear says he doesn’t think this will hold up globally for long.
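Dear’s edge-computing picture – classify on the platform, retask on confident leads, and send only what clears a threshold over the datalink – can be caricatured as a tiny on-board tasking loop. Everything below, from class names to priority values, is a hypothetical sketch rather than a description of any real system:

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Task:
    priority: int                      # lower number = more urgent
    name: str = field(compare=False)   # ignored when ordering the heap

class EdgeTasker:
    """Toy on-board tasking loop. Routine patrol legs queue at low priority;
    confidently detected leads pre-empt them, and only those confident
    detections are ever reported over the datalink."""

    def __init__(self):
        self._queue = []

    def add_patrol(self, name):
        heapq.heappush(self._queue, Task(10, name))

    def on_detection(self, label, confidence, threshold=0.8):
        # Retask only on confident, locally computed detections.
        if confidence >= threshold:
            heapq.heappush(self._queue, Task(1, f"follow {label}"))
            return True    # worth transmitting to the ground station
        return False       # discarded on-platform, saving bandwidth

    def next_task(self):
        return heapq.heappop(self._queue).name
```

The design point the sketch illustrates is the one in Dear’s quote: the filtering and reprioritising happen on the aircraft, so nobody has to watch every feed all the time.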

UV light the way

This is firstly because autonomous UAVs will speed up the pace of events within the battlespace. Soon there simply won’t be time to defer every decision to a human, especially when fighting a militarily and technologically sophisticated adversary. But Dear suspects that automated decision-making will eventually prove more reliable than human decision-making in avoiding civilian casualties: less prone to emotion than people, with sensors possessing greater discrimination than the human eye, combined with AI systems that are simply less likely to be swayed by motivated reasoning into seeing what they expect rather than what is actually there.

Much as with manned aviation, unmanned aircraft will need to be tailored to task, threat and budget.

“I think in combination, these two developments will see strike authority ultimately delegated to fully autonomous systems by many nations, if not all, in time,” Dear says. “We are not there yet, and the UK is right to show moral leadership, insisting humans must always make the final decision when lethal force is used. But we must also move to understand the issues in all their complexity. My view is that others will not show the same caution, though I hope I am wrong.”

This is not the only controversial change that improving UV sensors could bring to modern air forces in combination with other technological developments. On command and control, Dear offers a view that might, in some circles, be seen as quite contentious: in a world of artificially intelligent, autonomous unmanned aircraft sporting highly accurate surveillance technologies, the future of air force command will belong increasingly to engineers, intelligence officers and office-based operations staff rather than front-line aircrew.

In the medium to long term, Dear argues, once ‘smart’ UAVs are operating with the autonomy to act independently, it is going to be increasingly hard to justify privileging aircrew for aircraft squadron and more senior operational command, or for promotion.

Mission planners with an understanding of the battlespace, the enemy, the population and the technologies involved will be considered for such command roles – even if they have never flown into air combat themselves.

Most of the factors listed above are things that intelligence officers engage with much earlier than aircrew, while the technology issues are best understood by airmen and women with a communications engineering background rather than a piloting one. As aircrews increasingly become unmanned-vehicle operators, and those systems become increasingly autonomous, Dear foresees that command decision-making will eventually be federated between those with engineering expertise, dedicated mission planners who may never have flown, and intelligence staff. He fears, however, that militaries will take longer to embrace these changes than it will take for such new forces to appear on the battlefield.

“I worry that in air forces run by pilots there is a strong institutional resistance to the idea that manned aviation will one day be redundant,” Dear says. “That limits the degree to which we pursue and advocate for unmanned solutions to future challenges. The mantra that the future belongs to human-machine teaming and a manned/unmanned mix of combat aircraft might be true, but mustn’t become a shibboleth. We need to be getting beyond this – asking when we think manned aviation will cease to be a majority of our combat aircraft, and starting to plan for this future – or being very clear on why we think such a future is not possible or at least not worth planning for yet.”