IAI Eitan Israeli Drone
A Heron TP, also known as the IAI Eitan surveillance unmanned air vehicle, flies during an official inauguration ceremony at Tel Nof Air Force Base near Tel Aviv Feb. 21, 2010. Reuters/Gil Cohen Magen

JERUSALEM -- Last November, when Israel launched Operation Pillar of Defense to stop rockets fired by Palestinian militants, it did not put a single boot on the ground in Gaza. It flew drones instead.

The unmanned aerial vehicles, or UAVs, initially developed for surveillance, played a major offensive role in the eight-day campaign of air and sea strikes. The remotely controlled aircraft fired on military targets alongside Israel’s fighter jets and gunboats.

With no Israeli soldiers in the streets, unlike during the previous military campaign four years earlier, the buzz of the flying drones was a constant nightmare for people in Gaza. Around the world, that buzzing noise is not going away. Drones as war machines are here to stay, and Israel, together with the United States, is at the forefront of armed UAV development and use. In the last decade, U.S. drones have been sent on deadly missions to fight the war on terror worldwide, from Pakistan to Africa -- and that’s only one aspect of the new, nonconventional way of fighting conflicts.

Since 9/11, warfare has been dramatically transformed, from armies on battlefields to an unconventional mix of cyberwarfare, international military campaigns against rogue states and terrorist militias, and unmanned systems. War is becoming more asymmetrical, and technology has to catch up with this new reality of fighting.

“During Operation Pillar of Defense, we were able to intercept simultaneously as many as 15 rockets fired from Gaza, thanks to Iron Dome,” said former Israeli Army Brigadier General Daniel Gold, who heads the Israeli National Committee for Commercial/Civilian Cyber R&D. Gold, considered the father of Iron Dome, the missile defense system put into operation in 2011 to protect Israel from rockets and artillery shells fired from Gaza and Lebanon, spoke during a panel on “Tomorrow’s Wars” at the Israeli Presidential Conference in Jerusalem last week. The other panel members were two prominent international military affairs commentators: Edward Luttwak, a senior associate of the Center for Strategic and International Studies in Washington, and Michael Walzer, a professor emeritus at Princeton University’s School of Social Science.

Anti-missile systems -- Iron Dome’s third version is under development -- and military drones are two areas in which Israel excels. In 2010, Israel Aerospace Industries, or IAI, delivered to the Israeli Air Force the new Heron TP, an armed drone with a 26 meter (85 foot) wingspan capable of remaining aloft for 36 hours. At the other end of the size spectrum, IAI also makes micro and hand-launched nano-UAVs, which support troops in hostile environments.

Israel and the U.S. are by no means alone at the cutting edge of this military revolution. In dozens of other countries, including Brazil and China, private and state-owned companies have embraced commercially promising technologies; UAVs can be employed for civilian purposes as well, from rescuing people after natural disasters to tackling wildfires.

But war hasn’t gone completely high-tech yet. Again, Israel and America are good examples. “In today’s wars, like in Afghanistan or when Israel fought the Lebanese militia Hezbollah in 2006, high technology is only the opening of the show. You always have to resort to medium tech later and eventually to use special units or local fighters armed with rifles and mortars. Ninety percent of military budgets is still spent on conventional arsenals, while the rest goes to new weaponry developed by private companies,” Luttwak said.

That percentage is changing with the development of new unmanned weapons that can select and engage targets without further human intervention. Lethal autonomous robots, or LARs, as they are known, offer greater force projection (preserving the lives of one’s own soldiers), require fewer military personnel, promise much greater precision and accuracy, and do not act out of anger or fear as humans might.

Yet artificial intelligence and sophisticated sensors cannot fully replace humans, especially when it comes to the LARs’ compliance with humanitarian and human rights laws established by several international treaties and conventions since the end of World War II.

A report published last April by Christof Heyns, the United Nations’ special rapporteur on extrajudicial, summary or arbitrary executions, noted that LARs lack the qualitative judgment needed to comply with the rules of “proportionality” and “distinction.” They are unable, for instance, to distinguish a civilian from an enemy combatant or to use common sense to evaluate when the use of force may not be necessary.

Who should be held responsible for loss of life when LARs are used -- a moral as much as a legal question -- is another dilemma posed in the U.N. report.

“When weapons become something autonomous, this kind of warfare needs standards, because sooner or later other actors beyond the U.S. and Israel will have these devices. Then we may be under attack, and we might want to condemn it by what we would do and what we wouldn’t do with such weapons,” Walzer said.

“But even before this scenario, prudence and right judgment are necessary, because public opinion may be shaped by how we use a technological advantage,” added Walzer, who has published extensively about the morality and legitimacy of wars.

Some argue that excessive asymmetry in a conflict, one that would be increased by LARs, may trigger angry reactions from the weaker side, which could retaliate against civilians. Already, such justifications are being used by some terrorist organizations. For example, Hamas, the Islamic movement ruling in Gaza, and many among the local population have often justified rockets fired at Israeli cities as the only way to balance Israeli airstrikes over the Strip.

“The main question remains a technical one: Is it possible or will it one day be possible to develop a program, which enables a robot to distinguish not too stupidly between, on the one hand, legitimate targets, military objectives, combatants [...] and on the other civilians? ... Pending evidence of revolutionary technical developments, it may be wise to limit the use of automated weapons to situations in which no proportionality assessment is needed,” wrote Marco Sassoli, a professor of international law at the University of Geneva, in a recent paper titled “Autonomous Weapons: Potential Advantages for the Respect of International Humanitarian Law.”

Literature and cinema have tried many times to foreshadow the challenges raised by technology and provide answers. In his science fiction works, Russian-born American novelist Isaac Asimov postulated the three laws of robotics (I. “A robot may not injure a human being”; II. “A robot must obey the orders given to it by human beings, except where such orders would conflict with the first law”; III. “A robot must protect its own existence as long as such protection does not conflict with the first or second laws”).

In the 1987 Hollywood blockbuster "RoboCop," similar directives were given to the cyborg policeman featured as the movie’s main character. But in later movies of the "RoboCop" franchise, just as in Asimov’s novels, robots end up harming their creators.

Warnings raised by science fiction may be relevant to us today, as drones are already in use for law-enforcement purposes.

“In New Zealand, surveillance drones have assisted in criminal investigations. In one case, aerial imagery captured by drone cameras assisted in collection of sufficient evidence to charge a suspect with a criminal offense and bring that person before a court,” said Ben Clarke, an adjunct law professor at the University of Notre Dame, Australia.

“In the U.S., drones are widely used for border surveillance. In New York, the chief of police has expressed interest in buying drones. ... In other parts of the United States, some legislatures have regulated use of drones, while various law-enforcement authorities across the country have admitted to having or wanting to acquire drones,” Clarke said.

Drones could serve a number of purposes in law enforcement, from crowd control to border patrolling to hostage situations. Israel already uses remotely controlled weapon stations along the border with the Gaza Strip, according to the U.N. Office for the Coordination of Humanitarian Affairs, or OCHA, in the Palestinian Territory.

“You can't predict how far technology will advance. Robots that function as pack mules may one day become standard army kit once they can dodge walls and climb rough terrain. For now, artificial intelligence doesn’t seem to have developed to the point where nuanced decisions that humans can make could be made by robots. On the other hand, use of some proven technology, including armed drones, in law enforcement seems far less likely. Despite their proven capabilities, use of armed drones in regular policing would be a highly controversial development. At this stage, there is little if any appetite among states to go down this path. Secondly, I do not think the public would accept it,” Clarke said.

While money and science are likely to get us closer to the future envisioned by writers and film directors, it is up to political decision-makers to regulate technological developments. UAVs and LARs may assist police and military forces in conducting surveillance, or they may evolve into weapons and eventually into warriors. One thing is certain: Pandora’s box is open.