
AI warfare: Can humans really control autonomous weapons?

AI warfare is advancing fast, but experts warn human oversight may fail as black-box systems make decisions that even their developers cannot fully explain

Published April 16, 2026

The rapid rise of AI warfare is forcing a rethink of one of the military’s core assumptions: that humans remain in control of machines on the battlefield. Experts now warn that this belief may be more fragile than it appears.

No longer confined to analysing intelligence, artificial intelligence technologies are now integrated into weapon systems that select their own targets, defend against missiles and guide drones.


This growing capability shifts the question from whether artificial intelligence will be used on the battlefield to how much control humans actually retain.

The key principle underlying current policy is to keep a “human in the loop”. Guidelines hold that this practice preserves human accountability and minimises risk.

Some scholars, however, argue that this approach provides little real assurance: modern AI algorithms remain black boxes, meaning that even their developers do not understand the calculations behind a given decision. A human can grant approval without fully comprehending what is being approved.

Danger of black box AI systems

This lack of transparency leads to what experts call the intention gap: AI does not simply carry out instructions, it interprets them.

In combat, for example, an AI might select a target that accomplishes its mission while disregarding moral constraints its operators assumed were implicit.

The danger lies not in AI acting independently, but in humans being unable to foresee its intentions before authorising it. Competitive pressure also drives the adoption of autonomous systems.

If one side adopts faster, machine-based decision-making, others may feel compelled to respond in kind.

Pareesa Afreen
Pareesa Afreen is a reporter and sub-editor specialising in technology coverage, with three years of experience. She reports on digital innovation, gadgets and emerging tech trends, while her editorial role ensures clarity and accuracy in accessible, engaging stories for a fast-evolving digital audience.