Israel’s intelligence services have been discovered to be using an artificial intelligence system dubbed “Lavender” to identify and locate Palestinians suspected of being active members of Hamas’ military wing and Palestinian Islamic Jihad (PIJ). In testimonies given to journalist Yuval Abraham for a report published in +972 Magazine, six Israeli intelligence officers with experience using the Lavender system revealed how personnel were reduced to mere “rubber stamp[s]”. Israel’s unscrupulously systemised use of technology diffuses individual complicity in the excess deaths caused by such systems of assassination, and provides a ready scapegoat against accusations of breaches of international law and of genocide.
The Lavender system operates by identifying potential targets associated with Hamas or the PIJ from intelligence data and placing them on kill lists that are passed to the Israeli army. In the initial stages of the war, the system identified some 37,000 Palestinians as potential targets. According to the +972 report, Lavender differs from another artificial intelligence system previously used by the Israeli armed forces, named “The Gospel”, which identified buildings suspected of being used by militants rather than individual people.
This system of targeted assassinations is justified through the designation of “human targets”, a category which, before the current conflict in Gaza, was limited to high-ranking officials of militant groups such as Hamas. Since 7 October 2023, the Israeli army has expanded the definition to include all operatives of Hamas’ military wing. The vastly enlarged list of potential targets meant that personnel had to be sidelined so that the Lavender system could handle the increased workload, despite the system being known to make “errors” in 10% of cases and to target people with only a loose connection to militancy.
An accompanying system called “Where’s Daddy?” would track selected targets and notify the army when a target returned home, at which point the army would carry out an air strike on the target’s home. These strikes were often carried out at night, whilst the target’s family was present. For lower-ranking targets, unguided “dumb bombs” were utilised to reduce expenditure on precision ordnance. A Guardian article cited experts who believe this could be one of the reasons the death toll in Gaza has been so high.
+972’s sources stated that, just two weeks after 7 October, the system was approved by Israeli officials and cleared for widespread use across the armed forces, and that its output was from then on to be treated by Israeli soldiers as the equivalent of an official order.
The use of artificial intelligence as an indiscriminate tool of surveillance and a producer of mass casualties is a frightening development in warfare. A significant advantage Israel gains from the use of AI, especially in light of the genocide case before the International Court of Justice (ICJ), is its value as a scapegoat. Under the official pretext of relieving a data bottleneck, the Lavender system makes it harder to identify the “dolus specialis” (or “special intent”) required to prove genocide. Suddenly, thousands of innocent casualties can be attributed to individual misconduct or to failures of the system, rather than to an intentional attempt to murder civilians.
We in the West will not remain sheltered from the type of suffering seen in Gaza. AI is already being used in the war in Ukraine through the US tech firm Palantir. Israel has been accused of using Gaza as a laboratory to test newly developed arms. Instead of the IDF using AI against Palestinians, it could one day be the Metropolitan Police using AI against the Palestinian solidarity rallies that are so often labelled “hate mobs”. This is the dystopian reality come true.