The Future of Warfare Is Happening In Gaza

Gaza is not just a (war-)crime scene, it's an AI laboratory. And it's clarifying that the point of AI-enabled war isn't precision—it's scale.

Frank R. Paul for Science & Invention, January 1922. Public domain, via Wikimedia Commons.

Edited by Sam Thielman


A TREMENDOUS PIECE OF JOURNALISM, published in the Israeli outlets +972 and Local Call, and confirmed by The Guardian, sheds crucial light on Israel's war on Gaza—and, for that matter, on wars of the future. 

The distinguishing feature of the two-month-old Gaza War is the stunning level of its devastation. The Palestinian Health Ministry estimated on Monday that the death toll approaches 16,000, with another 42,000 wounded. With the post-ceasefire phase of Israel's war targeting southern Gaza, where it earlier told the 1.9 million people now displaced by the war to go, "space for the humanitarian response allowed inside Gaza is constantly shrinking," Lynn Hastings, the United Nations' humanitarian coordinator for the occupied Palestinian territories, said Monday. (Israel last week told the U.N. it was revoking Hastings’ visa over her "refusal to speak out against Hamas.") Even Israel's arms dealer is starting to warn that this is how asymmetric wars are ultimately lost. (More on that in the next section.)

Thanks to +972/Local Call's Yuval Abraham, we have an inside look at how all this state violence is systematized. It occurs thanks to the maturation of an AI-enabled program for generating targets called, discomfitingly, Habsora, which means "The Gospel" in Hebrew. Habsora ingests a massive amount of data—the specifics of it remain unclear—and recommends an astonishing number of targets. In an under-noticed interview from earlier this year that Abraham dredged up, the former IDF chief of staff, Gen. Aviv Kochavi, reflected on Habsora's use in the 11-day 2021 Israeli bombing of Gaza. "You see, in the past there were times in Gaza when we would create 50 targets per year," Kochavi said. "And here the machine produced 100 targets in one day." And that was what the machine could do two years ago. It's very unlikely that Israel would accept a post-war United Nations committee of inquiry, but any such inquiry would surely seek to examine the impact of Habsora. 

The lens of Abraham's reporting clarifies not only Israel's war on Gaza, but the emerging age of AI-enabled warfare. "Cutting edge technology is supposed to make warfare more precise," said an al-Jazeera segment on Israeli AI in Gaza, "but the evidence on the ground suggests the opposite may be true." Precision, however, is the wrong way to think about it. AI is not about precision. It's about scale. 


THE DATA THE AI TRAINS ON MAY INDEED BE GRANULAR. Abraham's article reports that Israel has substantial intelligence locating where Hamas fighters, even low-ranking ones, live. The Guardian's follow-up article noted Israel's extensive "drone footage, intercepted communications, surveillance data and information drawn from monitoring the movements and behavior patterns of individuals and large groups."

But those are inputs for Habsora, not outputs. Designed into this and every AI system are parameters—what, in human terms, we would call presumptions. When it comes to AI's use in warfare, those presumptions include policy choices and military doctrine that help determine what Habsora will consider a legitimate target to generate for the IDF. In this case, those presumptions include concepts like "Power Targets," which Abraham notes are "not distinctly military in nature… private residences as well as public buildings, infrastructure, and high-rise blocks."


And that's to say nothing of intelligence that's either straight-out wrong, since identification mistakes are bound to happen on the front end of intelligence collection, or derived from machine-learning prediction—something capable of being foiled by, say, the human whim to take a different route to the store this morning. "[T]estimonies of Palestinians in Gaza suggest that since October 7, the army has also attacked many private residences where there was no known or apparent member of Hamas or any other militant group residing," Abraham reports. 

Existing alongside the "Power Target" concept for the IDF are decisions about acceptable civilian deaths. Abraham reminds us that those have changed over time. In 2002, Israel reaped outrage when it bombed the home of Hamas military commander Salah Shehadeh, killing Shehadeh, his aide and 13 of his relatives and neighbors, including eight children. Twenty-one years later, Israel considers it acceptable to bomb the residences of even low-level Hamas fighters. Once that choice has been made, Israel is necessarily going to kill scores of Palestinians with no ties to Hamas beyond their misfortune to live in the same apartment building or on the same street as someone who does. Strikes on such targets, in such an environment—one of the most densely packed areas on earth, remember—operate as a "means that allows damage to civil society," one of Abraham's IDF sources said. At that point, Habsora might be better understood as a war-crimes engine.

Abraham writes that his sources, to one degree or another, understood that inflicting such "damage to civil society" is "the real purpose of these attacks." The point of bombing "Power Targets" is to shock Palestinians into "put[ting] pressure on Hamas," one of his sources said. That sounds awfully reminiscent of the U.S. airpower concept of "Shock and Awe" made famous during the Iraq War: inflicting devastation so psychologically overwhelming as to break Iraqis' will to resist. If you remember the Iraq War, you'll remember it didn't exactly turn out that way. The scale promised by military AI is like a combination of Shock and Awe and the CIA's War on Terror "Signature Strikes." Those strikes killed people whose observed "patterns of life" fit analyst-presumed patterns of militant activity, like being a man between the ages of 18 and 50 who carried a gun. The CIA didn't even have to know their names to kill them.

For the decade or so that theorists and policymakers have been discussing the emergent military applications of AI, the Pentagon has offered up as a measure of reassurance and probity that a human being will always make the decision to use lethal force. (Although the Pentagon also recently eroded this allegedly crucial safeguard.) Habsora proves how weak that reassurance truly is. The generation of the target by the machine all but predetermines the decision to fire. A human eye "will go over the targets before each attack, but it need not spend a lot of time on them," a former Israeli intelligence officer tells Abraham. Another of Abraham's sources, this one with experience in targeting, observed, "It really is like a factory. We work quickly and there is no time to delve deep into the target. The view is that we are judged according to how many targets we manage to generate."

I have to admit that when I read those quotes, I kicked myself for not having realized earlier, given all the military processes of routinization I've seen up close, that of course this would be the relationship between machine target generation and human target execution.

It also seems worthwhile to look at Habsora in light of a different aspect of Israeli intelligence. Last week the New York Times reported that Israel acquired a document that functionally outlined the Oct. 7 Hamas attack over a year before it happened. Three months before the attack, an analyst in Unit 8200, Israel's equivalent of the NSA, concluded that "an intense, daylong training exercise" by Hamas matched the "Jericho Wall" document, but the warning was disregarded. The Times attributes the disregard to a belief that Hamas "lacked the capability to attack and would not dare to do so." Perhaps it's worth asking what similarly dubious assumptions the IDF is designing into its target-generation software. A common thread running through both cases seems to be an inability to understand Palestinians as human beings.

Not every military application of AI will work like Habsora. Not every AI-enabled war will look like Gaza. As Abraham meticulously traces, Israel has embraced a military doctrine of disproportionate force, including against civilian targets, that not every state using AI will accept. (Disproportionate force and failure to distinguish between civilians and combatants are, according to international humanitarian law, war crimes.) But every military that uses or seeks to use AI, whatever its government's position on a ceasefire specifically or Israel/Palestine generally, will look at Gaza as a test case. "Other states are going to be watching and learning," a former White House official told The Guardian.


OK, NOW ONTO THE WARNINGS of Israel's arms dealer. 

Over the weekend, Defense Secretary Lloyd Austin gave a much-noticed speech that many interpreted as waving Israel away from the sort of war it's fighting. Austin, who commanded the terminal phase of the 2003-2011 U.S. occupation of Iraq and the opening phase of the 2014-2017 war against ISIS, offered the lesson "that you can only win in urban warfare by protecting civilians."

The smell of bullshit here is really overwhelming. First, only compared to Israel in Gaza can we say the Iraq occupation and the subsequent war against ISIS "protected civilians." Second, and more urgently, the enterprise Austin runs is providing Israel with an unspecified number of 2,000-lb. BLU-109 bunker-buster bombs, the Wall Street Journal recently revealed. An Israeli bombing that leveled an apartment building in the Jabalia refugee camp, described by the Journal as "one of the deadliest strikes of the entire war," used American ordnance. (I can't tell if the Journal is saying the specific munition used was a BLU-109.)

As long as the Biden administration arms Israel in its war, nothing it says about the right or wrong way to use that weaponry matters. Austin is lecturing Israel about understanding that "the center of gravity is the civilian population." Well, Israel's response to the center-of-gravity problem is Habsora and Power Targeting. Austin warns Israeli leaders to "shun irresponsible rhetoric" as if his defense counterpart, Yoav Gallant, didn't say Israel was fighting "human animals" and as if Benjamin Netanyahu didn't invoke the divine command to annihilate the Amalekites. Austin and his Biden administration colleagues aren't detached observers here. They're accomplices.

Austin might also worry more about escalation involving U.S. forces. On Monday, the U.S. killed five militants in a drone strike near Kirkuk in Iraq who it said were preparing to launch a drone attack on U.S. forces. And now there's a persistent naval component to this war, one that may prompt a multinational naval task force to escort commercial ships in the Red Sea, along with the prospect of U.S. military action against the Iran-backed Houthi government in Yemen. Even Mohammed bin Salman is telling the U.S. not to overreact to the Houthis.


TOMORROW, THURSDAY, DECEMBER 7, the organized workers of the Washington Post will engage in a 24-hour strike to compel management to return to good-faith contract negotiations. They ask that you not read, click, stream, etc., any material generated by the Post. Let's stand with them. This is the easiest picket line not to cross! 


STEPHEN SEMLER'S NEWSLETTER has been on fire lately. Here he is documenting a staggering $101 billion in Biden administration arms sales in just the last 11 months. Click through and you'll see that's an undercount. And here he is last week showing that Iranian proxies respected the quasi-ceasefire and held off on attacking U.S. forces. If you like FOREVER WARS, you'll like his newsletter—he can do infographics and I can't. 


NOT SICK OF ME TALKING? I bet I can get you there by the end of this section. I talked for nearly an hour about Henry Kissinger with Hasan Piker for his livestream. If you stay to the end you'll hear me not only plug WALLER VS. WILDSTORM but talk about my unpublishable pitch for a Sabra miniseries that Cerebro listeners have heard me reference. I was on al-Jazeera's podcast The Take talking about Kissinger's legacy. Jordan Uhl had me back for my third appearance on The Insurgents to talk about Kissinger. Marisa Kabas also did a fun Q&A with me about the mechanics of the Kissinger piece for her newsletter The Handbasket.


SPEAKING OF WALLER VS. WILDSTORM! Our DC Comics superhero spy thriller concludes next week, with the senses-shattering fourth issue in stores on Tuesday, December 12! At the end of issue 3, Adeline Kane remarks to Amanda Waller that they've won. But have they? And what will Lois Lane do with that tape recorder – to say nothing of all her reporting about Amanda in Gamorra! You don't want to miss this one, and you don't have to! Brooklyn's finest, Bulletproof Comics on Nostrand Avenue, has you covered on signed and certified copies of WALLER VS. WILDSTORM, including full signed sets of all four issues—it's the perfect holiday gift for the FOREVER WARS reader! Act now because supplies are limited and these WILL sell out!