Amsterdam Law School
8 February 2024
The deployment of AI on the battlefield has taken off. Israel, for example, uses the Iron Dome to intercept missile attacks. ‘You can't do that by hand. You need the lightning-fast reaction times of computers for that,’ Kwik explains. And on the war front in Eastern Europe, drones also play a major role in carrying out attacks.
This relatively new technology creates problems in applying international humanitarian law. ‘Lawyers often have a poor understanding of what AI is,’ he observes. ‘Many of the lawyers who have to say how the law should be applied do not themselves have a background in technology and artificial intelligence. That causes misunderstandings. In the first year of my dissertation, I therefore focused purely on the technical side and took first-year computer engineering classes.’
‘There is, for example, already a system for distinguishing civilians from military personnel. Is someone carrying a rifle and wearing a uniform? All of this is recorded. But with a specific combination of certain headgear and a certain belt, the AI may well conclude that a civilian is a soldier. That could lead to a civilian being shot. How should a battlefield commander deal with such systems? And when is he or is he not responsible for the consequences?’
‘Many people say that no one is responsible if the system makes a mistake. But one of my assertions is that the commander usually remains responsible. After all, he or she decides whether or not to deploy a system and in what way. A lot has to precede that, though. You can't expect military personnel to understand the technology completely. But if you use a weapon, you have to know enough about the underlying technology to make informed choices. How accurate is a system? How do you analyze this? Under what conditions can mistakes occur? Has the system also been tested in the snow? In addition, you need to look at your opponent's technical actions. Viruses can be used to disrupt your system. Or consider a modern form of deception: a cloth with a certain pattern hung on an apartment building, for example, can make the system think it is a military building.’
If you use a weapon, you have to know enough about the underlying technology to make informed choices
‘The most important thing is that the commander has enough knowledge, which can also be provided through a technical advisor. Commanders already have a legal advisor and a policy advisor. I foresee that there will also be a technical advisor who can explain to the commander what the risks of the systems are in certain circumstances and which settings can be changed. For the military, the most important thing is to win the war. Soldiers don't have to figure out the technical part all by themselves. But I think the military organization should require the commander to at least have access to technical knowledge. That explicit obligation does not currently exist in the law of war.’
‘That is complicated to answer, because a lot of that information is classified. I suspect there is already some training. An army has no interest in developing weapons that commanders cannot master well. But it must also be made clear what a commander's duties and responsibilities are.’
‘The beauty of international humanitarian law is that it is written in a way that is relatively future-proof. It is fairly abstract, so you can also apply it to new technology. There is no need to rewrite that law. But applying that law to new technology can be tricky. A new handbook on cyber attacks was recently published. My dissertation offers a foundation for a new military handbook in the field of autonomous weapons systems.’