The three vehicles were on a route designed for aid vehicles. The Israeli government was “aware of their itinerary, route, and humanitarian mission.”
One had the World Central Kitchen (WCK) logo on its roof.
Yet one or more Israel Defense Forces (IDF) drones, operated by “artificial intelligence”1 software, targeted them, one by one.
Munition dropped right through the World Central Kitchen logo pic.twitter.com/RKcT01Z7SJ
— John Hudson (@John_Hudson) April 2, 2024
According to WCK chef José Andrés, survivors of the first strike made their way to the second vehicle. Israel dropped a precision bomb on that vehicle, an armored car with the WCK logo on its roof. Survivors of that strike joined the third vehicle. The IDF bombed it as well. The scope of the horror? About a mile from the first to the last bombed vehicle.
Seven died due to a deliberate and “[systematic] car-by-car” attack.
Prime Minister Benjamin Netanyahu called it an “accident” and “unintentional.” He lied.
Haaretz reported Tuesday that the IDF targeted the WCK vehicles because they believed that “an armed man thought to be a terrorist” was in one of them. However, “the target” had not left the warehouse with the cars.
[T]he war room of the unit responsible for security of the route that the convoy travelled identified an armed man on the truck and suspected that he was a terrorist.
Was that bad intel from humans or automated algorithms? The IDF has been using Lavender, a “previously undisclosed AI-powered database” that had, “at one stage, identified 37,000 potential targets based on their apparent links to Hamas.”
A senior officer who had okayed strikes identified by Lavender told +972 journalist Yuval Abraham:
“I would invest 20 seconds for each target at this stage, and do dozens of them every day. I had zero added-value as a human, apart from being a stamp of approval. It saved a lot of time.”
“Saved a lot of time” while killing indiscriminately. The IDF denies the charges.
At what point do we shift to the more accurate verb, murder, or noun, assassination?
Late last year, University of Southampton professor Christian Enemark made the case for “restrain[ing] the use of armed, uninhabited aerial vehicles (commonly known as ‘drones’).”
Rather than indiscriminate use such as that deployed by the IDF, Enemark argues that “an armed drone should only be used to protect a person or persons facing an immediate threat of serious harm.”
There should be “meaningful human control,” he said. Not a cursory 20-second review.
Last year, the Royal Air Force (UK) began “pioneering mini-helicopter drones that can fire missiles at targets four miles away.”
Attorney Khalil Dewan asks: “how legal and ethical is it to kill suspected combatants instead of capturing them and providing a fair trial,” especially when there is no official declaration of war.
Remember 2021? A U.S. drone strike in Kabul killed 10 civilians. The U.S. caught hell for that, domestically and abroad.
Israel has killed multiple thousands of civilians over the past six months. According to Briana Rosen at Just Security, “Israel attacked roughly 25,000 targets [in the first two months], more than four times as many as in previous wars in Gaza.”
As Rosen has pointed out, “no single person is going to fully understand how this technology works.” She reported last October that “several [countries] and the International Committee of the Red Cross (ICRC) have proposed banning weapons systems that lack meaningful human control and are too complex to understand or explain.”
Another reminder: is a 20-second review “meaningful”? I don’t think so.
This week, Human Rights Watch (HRW) described as “unlawful” an early IDF attack on a residential building in central Gaza that killed 106 people.
The New York-based group said it found no evidence of a military target in the vicinity of the building when the attack took place, which according to HRW made the air attack “unlawfully indiscriminate”… HRW said the Israeli authorities have not publicly provided any information about the attack, “including the intended target and any precautions to minimise harm to civilians.”
When more children are killed in four months of “AI” driven warfare than in all wars around the world combined over a four-year period … we have waded deep into a space better described as “murder” and “assassination.”
In the meantime, WCK has “asked the governments of Australia, Canada, the United States of America, Poland, and the United Kingdom to join us in demanding an independent, third-party investigation into these attacks.”
Featured image: Israel Defense Forces.
1. “Artificial intelligence” has no generally accepted definition. According to Dr. Emily Bender, professor at the University of Washington, it is a marketing term from the 1950s. I suggest my students refer to the systems as automated algorithms (software) or algorithmic automation (hardware). They are not sentient and have no agency.
Known for gnawing at complex questions like a terrier with a bone. Digital evangelist, writer, teacher. Transplanted Southerner; teach newbies to ride motorcycles. @kegill (Twitter and Mastodon.social); wiredpen.com