Kill Lists In The Age of Artificial Intelligence
"Lavender field in Drôme" by John Samuel. CC-BY-SA 4.0

Israel's "Lavender" AI, the lethal end result of mass surveillance, debuted in Gaza six months ago. It threatens to be the future of war. 

Edited by Sam Thielman 


"I would also remind you, sir, that we continue to look at incidents as they occur. The State Department has a process in place. And to date, as you and I are speaking, they have not found any incidents where the Israelis have violated international humanitarian law." 

White House national-security spokesman John Kirby, April 3

“In the bombing of the commander of the Shuja’iya Battalion, we knew that we would kill over 100 civilians,” B. recalled of a Dec. 2 bombing that the IDF Spokesperson said was aimed at assassinating Wisam Farhat. “For me, psychologically, it was unusual. Over 100 civilians — it crosses some red line.”

"B," a senior Israel Defense Forces officer, to Yuval Abraham at +972, published April 4

NOT GOING TO BE A LONG ONE FROM ME TODAY, as I'm balancing book writing with several projects I can't yet reveal, but Yuval Abraham has published his second stunning piece of journalism during the collective punishment of Gaza. Like his last one, it's profound in its implications both for Gaza and for the future of warfare. So I had to stop what I was doing. 

In November, Abraham revealed the existence of an artificial-intelligence target generation system called Habsora ("the Gospel"). I remarked in FOREVER WARS that Habsora underscored that the point of militarized AI is not precision but scale. This time, Abraham, citing six Israeli military officials with experience using the system, reveals another targeting AI, this one called Lavender. Lavender is basically what those of us who worked on the Edward Snowden documents warned was on the horizon as the end result of mass surveillance. "We kill people based on metadata," the former NSA and CIA director Mike Hayden once acknowledged. Lavender, as reported, does that at scale. It's an AI for Signature Strikes.

The IDF officers who describe the system present it as effectively the end result of mass surveillance. Only through the collection and retention of pattern-of-life data on an unfathomable scale—the sort that Israel routinely collects on Palestinians, including in Gaza—could Lavender generate "a rating from 1 to 100, expressing how likely it is that they are a militant." Lavender chews up the patterns, habits and characteristics of known Hamas or Palestinian Islamic Jihad operatives and spits out likelihoods within the "general population," meaning 2.3 million Gazans, of who might be a militant, a term the officers stress is not rigorously defined. Then those people are hunted and killed, often deliberately when they are in close proximity to their families. If you've seen Captain America: The Winter Soldier, Lavender is a real-life Project Insight.

Practitioners describe Lavender as likely to get its target identification wrong. But "wrong" isn't really the right way of understanding what's happening when Lavender misidentifies a target as a militant. Instead, Lavender reflects the lax parameters set around target definition by its operators, and scales them up. The machine makes an assessment about who fits patterns of militancy, or did at one point in their life. The military treats those assessments as definitive, possessing as they do the imprimatur of large-scale machine learning—even though, as Abraham writes, "verification was largely abandoned in favor of automation." 
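To make that concrete, here is a minimal, purely illustrative Python sketch of how a threshold-based scoring pipeline of the kind Abraham's sources describe turns operator-chosen parameters into a target list at scale. This is not the actual system, about which nothing beyond the reporting is known; every name and number in it is hypothetical.

```python
# Purely illustrative sketch of a threshold-based "suspect scoring" pipeline.
# Nothing here reflects the actual Lavender system; all names, features and
# numbers are hypothetical. The point: the output list is a function of
# operator-chosen parameters, not of new evidence about the people scored.

from dataclasses import dataclass

@dataclass
class Profile:
    person_id: str
    score: float  # hypothetical 1-100 "militant likelihood" from pattern-of-life metadata

def generate_target_list(profiles: list[Profile], threshold: float) -> list[str]:
    """Everyone at or above the cutoff becomes a 'target.' Lowering the
    threshold grows the list; no human judgment or new intelligence is involved."""
    return [p.person_id for p in profiles if p.score >= threshold]

# Demo population, scaled down from Gaza's 2.3 million residents:
population = [Profile(f"person-{i}", score=float(i % 100) + 1.0) for i in range(100_000)]

strict = generate_target_list(population, threshold=90.0)
permissive = generate_target_list(population, threshold=70.0)
print(len(strict), len(permissive))  # the gap between the two lists is pure policy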

Children have been acceptable targets produced by Lavender—a violation that happens to corroborate a piece the Guardian ran on Tuesday. "Normally, operatives are over the age of 17, but that was not a condition" in Gaza after Oct. 7, according to one of Abraham's sources. Again, that's not a mistake. That's the reflection of a parameter programmed into Lavender.

Here's another of Abraham's sources who used Lavender:

"There was a completely permissive policy regarding the casualties of [bombing] operations — so permissive that in my opinion it had an element of revenge," D., an intelligence source, claimed. "The core of this was the assassinations of senior [Hamas and PIJ commanders] for whom they were willing to kill hundreds of civilians. We had a calculation: how many for a brigade commander, how many for a battalion commander, and so on."

For over a decade, military AI circles have wrung their hands over a binary: whether or not a human being is "in the loop" in determining the use of lethal force. But placing a human in the loop is not a sufficient ethical safeguard. A human is still in the military loop with Lavender, but a human brain is no longer necessary. Abraham reports that the commander of Unit 8200, Israel's NSA, considers that a virtue: 

Describing human personnel as a “bottleneck” that limits the army’s capacity during a military operation, the commander laments: “We [humans] cannot process so much information. It doesn’t matter how many people you have tasked to produce targets during the war — you still cannot produce enough targets per day.”

Lavender, needless to say, has no such constraints. "[D]uring the first weeks of the war, the army almost completely relied on Lavender, which clocked as many as 37,000 Palestinians as suspected militants," Abraham writes.

While Abraham reports that Lavender has come under greater restriction since the Biden administration began publicly criticizing Israel, anyone who looks at Gaza can see that target replenishment is not a problem for the IDF. He reports that within the first weeks of the war, not only were junior Hamas operatives permissible targets, but "it was permissible to kill up to 15 or 20 civilians" in the course of killing one. Another program, this one with the obscene name Where's Daddy?, reportedly pings when targets are assessed to be physically proximate to their families, at which point, Abraham reports, the IDF chooses to kill them. "[S]everal sources emphasized that, as opposed to numerous cases of Hamas operatives engaging in military activity from civilian areas, in the case of systematic assassination strikes, the army routinely made the active choice to bomb suspected militants when inside civilian households from which no military activity took place," Abraham notes. 
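In engineering terms, the "ping" described is nothing more exotic than a geofence trigger over a location feed. A hypothetical sketch, with every name invented here and nothing drawn from any real system, shows how little machinery the reported behavior requires:

```python
# Hypothetical sketch of a geofence trigger of the kind described; every
# name is invented and nothing here reflects any real system.

from math import radians, sin, cos, asin, sqrt

def within_meters(lat1: float, lon1: float, lat2: float, lon2: float,
                  radius_m: float = 50.0) -> bool:
    """Haversine check: is a tracked device within radius_m of a
    registered coordinate, such as a family residence?"""
    earth_radius_m = 6_371_000
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * earth_radius_m * asin(sqrt(a)) <= radius_m
```

The hard part is not the trigger; it is the mass location data behind it, which is precisely what the reporting says Israel already collects.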

I hesitate to use the word assassination, for the same reason I hesitated when trying to explain the CIA's Signature Strikes. While the slaying is in some sense tied to a target, the target is not necessarily a person whose identity is known to the targeters. Instead, the target is someone whom a process assesses to fit a pattern similar to those of known targets. 

It is hard at this point not to see Gaza as a laboratory for future warfare. It's not just Lavender, and it's not just Habsora; it's also the innovation of the small, armed quadcopter drone, about which I may be writing more in a few weeks. The Biden administration continues to pursue its One Big Regional Idea, which is to secure Arab (in this case Saudi) recognition of Israel, and a principal reason Gulf states are open to this is to acquire Israeli military and surveillance technology. 

More broadly, technological innovations in warfare go where the markets for them are, and the markets for them change. This is how police precincts become "peacetime" customers for battlefield surveillance and tactical equipment. Just the other day there was a piece in Bloomberg about how "Deadly Drones Are Changing the Face of Warfare in Africa," and that simply was not something anyone at the CIA or the Air Force was thinking about a generation ago when they debuted weaponized drones. And the present conditions of mass surveillance enable Lavender to bloom very, very widely. 


I DON'T KNOW HOW MANY TIMES I can write that the U.S. needs to engage in sustained, high-level diplomacy with Iran. But now that Israel has assassinated an Iranian general in a strike on Iran's consulate in Damascus, and now that Iran-backed militias are starting to conduct drone strikes on Israeli naval installations, it would be really great if that diplomacy could get under way.


WALLER VS. WILDSTORM, the superhero spy thriller I co-wrote with my friend Evan Narcisse and which the masterful Jesús Merino illustrated, is available for purchase in a hardcover edition! If you don't have single issues of WVW and you want a four-issue set signed by me, they're going fast at Bulletproof Comics.

No one is prouder of WVW than her older sibling, REIGN OF TERROR: HOW THE 9/11 ERA DESTABILIZED AMERICA AND PRODUCED TRUMP, which is available now in hardcover, softcover, audiobook and Kindle edition. And on the way is a new addition to the family: THE TORTURE AND DELIVERANCE OF MAJID KHAN.