Over 50,000 people eagerly picked up Microsoft’s HoloLens augmented reality headset. The product was originally designed for gamers, technicians and doctors, but those users could not have known that their $3,000 investment, and the time they spent giving feedback and improving the HoloLens, would end up benefiting the military.

Those HoloLens users, and many Microsoft employees, were shocked to learn that Microsoft had sold the technology to the Pentagon for a project aimed at “increasing lethality” on the battlefield.

HoloLens is one of many examples of “dual-use” civilian technology that can be repurposed for the battlefield. Consumers and employees have a right to know where each company stands on the military use of its technology, especially when it comes to AI.

Lethal autonomous weapons, popularly known as killer robots, could select and attack targets without a human operator’s direct input. Precursor weapons that show the trend toward increasing autonomy already exist: the IAI Harop drone is one example.

Autonomous weapons would violate fundamental ethical and legal principles by removing human control from life-and-death decisions. Today’s software does not begin to have the sophistication required to follow the laws of war and armed conflict.

Small numbers of people could control very large forces of autonomous weapons. Such weapons are a potential new weapon of mass destruction, weakening the essential checks and balances on the exercise of military power. There is also a risk of a new global arms race in the development of autonomous and AI-driven weapons.

These realities make a new survey of 50 technology companies by PAX vitally important. ‘Don’t Be Evil?’ shows that major companies such as Amazon, Microsoft and Palantir have no comprehensive policy to prevent their technology being used in killer robots. Many of the profiled companies, known for technology with civilian applications, are increasingly selling their tech to the military.

I was employed by one such tech giant, which, I later learned, had signed secret contracts with the US Department of Defense as part of Project Maven, an AI-driven military surveillance project. Because I hold strong ethical objections to my work being repurposed for military projects that form part of the ‘kill chain’ of target identification, force dispatch, and destruction, I felt compelled to leave my career at Google.

Objections like mine led Google to withdraw from the project and to establish one of big tech’s only explicit policies prohibiting the development of weapons.

Pressure on other companies shows that transparency, diligence and, when necessary, action by employees and consumers can halt the slide toward lethal autonomous weapons.

Technology companies should ensure that they know exactly how and in what context their technology is used, particularly where military contracts and dual-use technology such as object recognition are concerned. Mandatory contractual stipulations and auditing are the only way to guarantee that algorithms do not end up on the drawing board of a killer robot architect. Finally, technology companies must ensure that employees are informed when they are working on technology for military applications.

Tech workers are increasingly taking a stand against the weaponization of dual-use technologies. We are asking for greater transparency and clarity about the intended uses of the technology we develop. No one should be forced to contribute to “increasing lethality”, especially unknowingly, yet tech workers face this scenario today, and consumers may be contributing unwittingly as well when they use these companies’ products. Now is the time for clarity and transparency from the tech sector.

Laura Nolan has been a software engineer in industry for over 15 years, including more than five years at Google. She was one of the (many) signatories of the “cancel Maven” open letter and campaigned within Google against Project Maven before leaving the company in protest. In 2018, Nolan began campaigning with the Campaign to Stop Killer Robots and founded TechWontBuildIt Ireland.