Tel Aviv [Israel]: Amid the war against Hamas in Gaza, the Israeli military has been using artificial intelligence to help identify bombing targets in the war-torn region, CNN reported, citing an investigation by +972 Magazine and Local Call based on inputs from six Israeli intelligence officials involved in the alleged program.
The Israeli officials, quoted in an extensive investigation by the online publication jointly run by Palestinians and Israelis, said that the AI-based tool was called “Lavender” and was known to have a 10 per cent error rate.
When asked about +972 Magazine’s report, the Israel Defence Forces (IDF) did not dispute the existence of the tool but denied that AI was being used to identify suspected terrorists. In a lengthy statement, it emphasized that “information systems are merely tools for analysts in the target identification process,” and that Israel tries to “reduce harm to civilians to the extent feasible in the operational circumstances ruling at the time of the strike.”
CNN also quoted the IDF as saying that “analysts must conduct independent examinations, in which they verify that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives.”
However, one official told +972 that human personnel often served only as a “rubber stamp” for the machine’s decisions, typically devoting only around 20 seconds to each target – enough to confirm the target was male – before authorizing a bombing.
The investigation comes amid intensifying international scrutiny of Israel’s military campaign after targeted air strikes killed several foreign aid workers delivering food in the Palestinian enclave. Israel’s war in Gaza has killed more than 32,916 people, according to the Gaza Ministry of Health, and has led to a spiralling humanitarian crisis in which nearly three-quarters of the population in northern Gaza are suffering from catastrophic levels of hunger, according to a United Nations-backed report.
The investigation’s author, Yuval Abraham, previously told CNN in January about his work examining how the Israeli military has been “heavily relying on artificial intelligence to generate targets for such assassinations with very little human supervision.”
The Israeli military “does not use an artificial intelligence system that identifies terrorist operatives or tries to predict whether a person is a terrorist,” the IDF statement on Wednesday said. Rather, its analysts use a “database whose purpose is to cross-reference intelligence sources, in order to produce up-to-date layers of information on the military operatives of terrorist organizations.”
Human officers are then responsible for verifying “that the identified targets meet the relevant definitions in accordance with international law and additional restrictions stipulated in the IDF directives,” according to the IDF statement, a process also described by +972.