Israel Lets AI Decide Who Dies in Gaza
April 12th, 2024
Via: The Libertarian Institute:
The Israeli military has employed yet another AI-based system to select bombing targets in the Gaza Strip, an investigation by +972 Magazine has revealed. The new system has generated sweeping kill lists condemning tens of thousands of Palestinians, part of the IDF’s growing dependence on AI to plan lethal strikes.
Citing six Israeli intelligence officers, the Tel Aviv-based magazine said the previously undisclosed AI system, dubbed ‘Lavender,’ has played a “central role in the unprecedented bombing” of Gaza since last October, with the military effectively treating its output “as if it were a human decision.”
“Formally, the Lavender system is designed to mark all suspected operatives in the military wings of Hamas and Palestinian Islamic Jihad (PIJ), including low-ranking ones, as potential bombing targets,” the outlet reported, adding that “during the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants—and their homes—for possible air strikes.”
However, while thousands have been killed in the resulting air raids, the majority were “women and children or people who were not involved in the fighting,” the officers told the magazine, noting that Israeli field commanders often rely on the AI system without consulting more substantial intelligence.
This reminds me of Pontius Pilate saying, “Don’t blame me. The local people did it.”