The deployment of AI technologies in military operations and surveillance systems raises significant questions under international law. The use of AI-enabled systems such as ‘Lavender,’ ‘Gospel,’ and ‘Where’s Daddy?’ for targeting operations in Gaza poses novel challenges to established legal frameworks and human rights protections.
These AI systems, which identify targets on the basis of surveillance data, operate within what Palestinian scholar Helga Tawil-Souri has called ‘digital occupation.’ The systems analyze hundreds of data sources, including Google Photos facial recognition, WhatsApp group memberships, and cellular information, to produce targeting recommendations that Israeli officers reportedly approve in as little as 20 seconds. This rapid decision-making process raises significant legal concerns under International Humanitarian Law (IHL), particularly regarding the distinction between civilians and combatants.
Article 48 of Additional Protocol I (AP I) to the Geneva Conventions codifies the principle of distinction, requiring parties to a conflict to distinguish at all times between civilians and combatants and to direct attacks only against military objectives. A 20-second review period for targeting decisions casts serious doubt on whether the duty to take all feasible precautions in verifying that targets are lawful military objectives, codified in Article 57 of AP I, has been fulfilled. Israeli intelligence officers have admitted to +972 Magazine that “nothing happens by accident” and that they “know exactly how much collateral damage there is in every home.” This admission strengthens potential arguments that the principle of proportionality under Article 51(5)(b) of AP I has been violated, since that provision prohibits attacks expected to cause civilian harm excessive in relation to the concrete and direct military advantage anticipated.
The involvement of major technology companies like Amazon Web Services and Google through Project Nimbus raises significant questions under the UN Guiding Principles on Business and Human Rights. The $1.2 billion cloud-computing contract, signed in April 2021, has enabled the storage and processing of surveillance data used in military operations. Similarly, Palantir’s “strategic partnership” with Israel to supply AI systems for “war-related missions” potentially implicates these companies in violations of international law.
The surveillance infrastructure extends beyond Gaza. AI-enabled control systems are ubiquitous in the occupied West Bank, particularly in al-Khalil (Hebron). The Blue Wolf smartphone application and the larger Wolf Pack database collect and process Palestinians’ personal information, including biometric data, without consent or a specified purpose, a practice Amnesty International has described as broad data capture. This systematic surveillance likely violates Article 17 of the International Covenant on Civil and Political Rights (ICCPR), which protects against arbitrary interference with privacy and is binding on Israel as a ratifying state. International Covenant on Civil and Political Rights, arts. 2(1), 17, 26, adopted Dec. 16, 1966, G.A. Res. 2200A (XXI), 999 U.N.T.S. 171 (entered into force Mar. 23, 1976; ratified by Israel Oct. 3, 1991).
In Occupied Jerusalem, the “Mabat 2000” video surveillance system, equipped with facial recognition technology, exemplifies settler colonial technologies. The density of surveillance, with CCTV cameras every five meters in some areas, creates a form of electronic siege. This comprehensive surveillance apparatus, combined with social media monitoring by the Israeli Cyber Unit, raises serious concerns under international data protection standards and human rights law. In September 2024, Israel remotely detonated thousands of pagers across Lebanon in an attack that killed and wounded civilians; Amnesty International has called for it to be investigated as a war crime. While Israel is legally bound by the ICCPR, enforcement remains challenging: the UN Human Rights Committee, which monitors the Covenant’s implementation, can issue only non-binding recommendations, though these carry significant diplomatic weight and can influence state behavior.
Legal remedies may be pursued through multiple channels, though jurisdictional complexities exist. While Israel is not a party to the Rome Statute and thus not a member of the International Criminal Court (ICC), the ICC has jurisdiction over alleged crimes committed in the territory of Palestine, which became a State Party in 2015. In 2021, the ICC’s Pre-Trial Chamber confirmed that the Court’s territorial jurisdiction extends to Gaza and the West Bank, including East Jerusalem. The ICC could therefore investigate the use of AI systems in targeting civilians as a potential war crime, even without Israel’s consent, and corporate executives and military officials could face individual criminal responsibility. Additionally, national courts could exercise universal jurisdiction, and civil litigation against technology companies could be pursued in their home jurisdictions.
Documentation of these systems’ operations and impacts is crucial for any legal proceedings. This includes technical evidence about system architecture and decision-making processes, as well as documentation of civilian casualties and discriminatory targeting patterns. The systematic nature of the surveillance could also support findings of human rights violations before regional human rights mechanisms.
The way forward requires enhanced legal frameworks governing AI in military contexts, stronger corporate accountability mechanisms, and effective export controls on surveillance technologies. Independent monitoring and transparent reporting requirements for the use of AI systems in conflict will be crucial starting points for enforcing international law and protecting human rights in an era of increasingly automated conflict.
The reach of international law is further complicated by the jurisdictional challenges posed by multinational involvement: while companies such as Amazon, Google, and Palantir are headquartered in the United States, their actions through contracts like Project Nimbus might expose them to jurisdiction in multiple venues, including their home jurisdiction under domestic law, the countries where their technology is deployed, and possibly international forums. For example, the U.S. Alien Tort Statute, although narrowed by recent Supreme Court decisions, may still provide an avenue for civil litigation against such corporations for serious violations of international law.
Ahmad Ibsais is an Associate Editor on the Michigan Technology Law Review.