The United States reportedly budgeted close to $18-billion in the period 2016 to 2020 for the research, design and development of autonomous weapon systems. (Noorullah Shirzada/AFP via Getty Images)
The 2021 movie Finch, in which Tom Hanks's character builds a robot, teaches it to be human and eventually becomes friends with it, is arguably a less terrifying representation of artificial intelligence than the 1984 cyborg assassin we know as the Terminator, sent from a post-apocalyptic future (2029) to kill.
But just how far are we from having killer robots — lethal autonomous weapons — being used in modern society? Or have we already crossed that threshold?
“Weapons that are extremely close to fully autonomous — guided bombs, all kinds of missiles — it is safe to say that we are dangerously close to that,” says Bruce Watson, chair of artificial intelligence for cybersecurity at the School for Data Science and Computational Thinking at Stellenbosch University.
Watson differentiates between classes of autonomous weapons.
“There are weapons that are in the physical world — tanks, aircraft, boats and missiles — and there are also weapons that operate in cyberspace, and that’s highly digital, so you can’t really visualise them in the same way.”
Semi-autonomous weapons need a “human in the loop” to take the final step. Fully autonomous ones, on the other hand, take it up a notch by removing human assistance; they set the target, aim and take action on their own.
Cyberspace vs autonomous weapons
“Autonomous weapons are more destructive in classical warfare; cyber weapons kind of fly under the radar, so autonomous weapons are definitely more impactful right now,” says Watson.
Autonomous weapons such as guided bombs or missiles can be detected and intercepted by the targeted party before they hit. This type of warfare can be physically seen and tracked, and it is confined to a specific location, which means one can be aware of a war in a country while remaining physically distant from it.
There are instances where these kinds of highly intelligent, AI-driven weapons are used in conflict situations. But pinpointing them can be difficult because “tracks are often covered up very well”, says Watson; no one wants to be caught violating international moral or ethical norms.
He does point out one system categorised as fully autonomous that has been overlooked for many years: the close-in weapon system, or CIWS. These weapon systems, which include guns, are mounted on military ships. Without the assistance of a human, they detect and destroy incoming missiles, aircraft and bombs.
The CIWS can fail, Watson notes, as in the recent incident in which the Russian missile cruiser Moskva sank in the Black Sea in April. Russia claimed an on-board fire, but the warship is believed to have been hit by a Ukrainian missile that its close-in weapon system should have intercepted.
Whereas you can remove yourself from a war in another country, in cyberspace the distance from one city to another is effectively the same.
“Once you’re exposed in cyberspace, it doesn’t matter how far you go on the planet. If you are connected in any way via the Internet or to some other kind of network, then you’re exposed,” Watson explains.
“It’s even worse in the sense that things travel very quickly in cyberspace. In normal warfare, we can see something coming. It takes a while for another country to prepare. Those things can be detected. In cyberspace preparations are largely invisible.”
Cyberspace also has a low entry barrier — everyone from teenagers to companies and law enforcement has access to it. You don’t need a fleet of tanks or fighter jets, only a cloud service, which makes it cost-effective as well.
South Africa has, until now, been relatively “safe” in terms of cyber attacks. But last July, state-owned rail, port and pipeline company Transnet said its port terminals had experienced “an act of cyber-attack, security intrusion and sabotage, which resulted in the disruption of TPT [Transnet Port Terminals] normal processes and functions or the destruction or damage of equipment or information”.
These kinds of attacks are disruptive, Watson says, adding that cyber wars and cyber weapons have so far been limited to disrupting businesses and imposing financial costs.
“It perhaps won’t always stay like that. Keep in mind that there will be increased numbers of cyber attacks that affect infrastructure, power supplies, communications but even affect hospitals and then we could actually see deaths when various diagnostic machines are taken offline,” he adds.
He says South Africa is not where it should be in terms of cybersecurity: “If you’re looking at how digital our society is, we should be more effective, we should have more capacity and capability and we are simply not doing enough.” Factors that have impeded development include the government’s underinvestment in cyber research and the departure of experts from the country.
In the event of a war, whether in cyberspace or with fully autonomous weapons, what ethical standards are there to guide countries in the development of artificial intelligence?
Globally, countries are putting large sums of money into autonomous weapons. The United States reportedly budgeted close to $18-billion in the period 2016 to 2020 for the research, design and development of autonomous weapon systems.
International bodies such as the United Nations and Nato have backed bans on autonomous weapons.
Artificial intelligence (AI) ethics researcher Emma Ruttkamp-Bloem, who heads the philosophy department at the University of Pretoria and is the leader of the ethics of artificial intelligence research group at the Centre for Artificial Intelligence Research (CAIR), says a lot of work is being done not only by governments but also by private and public organisations to regulate autonomous weapons.
“However, the big sums invested in these kinds of weapons and the vast strategic power that they promise make meaningful debate very hard to establish,” she says.
Ruttkamp-Bloem also points out that Russia, India, Israel and the US “were the big spanners in the wheel” at the review conference of the UN Convention on Certain Conventional Weapons in December 2021. South Africa was one of 40 countries that called for new international law to ban and restrict autonomous weapons systems.
Ruttkamp-Bloem cites the 11 guiding principles adopted by the UN Group of Governmental Experts to constrain the development and deployment of lethal autonomous weapon systems (LAWS).
“These principles focus mainly on establishing that the research, design, development, deployment and use of LAWS must comply with international humanitarian law,” says Ruttkamp-Bloem. “And that humans are accountable at all levels for these weapons.
“Thus, on the whole, the message is that there should be a goal of finding a balance between what is necessary from a military perspective and humanitarian and ethical concerns.”
Is a killer robot apocalypse possible, where robots become more dominant than humans? Watson says this is something the AI community is divided on.
The more progress is made and the more systems are built that learn on their own or start pulling in information from the internet and other sources, the greater the possibility of systems thinking and acting for themselves.
Watson explains: “The more we work on algorithms that do that and the more computing power we allow for this kind of thing for AI systems, they’re going to reach a point where they are generating knowledge at a pace that outstrips what a human can do. And they’re able to identify problems, real problems or imaginary problems, that they then perceive need solving one way or the other.”
But humans can maintain control because there are “ample collections of places that we humans can keep ourselves as an integral part of the systems, and be able to switch them off if we ever need to”.
We are at the point where AI operates a 3D printer, helps run an electricity system or power station, operates on patients and drives our vehicles.
“So we are specifically allowing these things into all of these arenas. And we do need to keep an eye in the long term that we don’t lose control,” says Watson.