‘The machine did it coldly’: Israel used AI to identify 37,000 Hamas targets
The Israeli military’s bombing campaign in Gaza used a previously undisclosed AI-powered database that at one stage identified 37,000 potential targets based on their apparent links to Hamas, according to intelligence sources involved in the war.
In addition to describing their use of the AI system, called Lavender, the intelligence sources claim that Israeli military officials permitted large numbers of Palestinian civilians to be killed, particularly during the early weeks and months of the conflict.
Their unusually candid testimony provides a rare glimpse into the first-hand experiences of Israeli intelligence officials who have been using machine-learning systems to help identify targets during the six-month war.
Israel’s use of powerful AI systems in its war on Hamas has entered uncharted territory for advanced warfare, raising a host of legal and moral questions, and transforming the relationship between military personnel and machines.
“This is unparalleled, in my memory,” said one intelligence officer who used Lavender, adding that they had more faith in a “statistical mechanism” than a grieving soldier. “Everyone there, including me, lost people on October 7. The machine did it coldly. And that made it easier.”
Another Lavender user questioned whether humans’ role in the selection process was meaningful. “I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time.”