Posted by z3d in ArtInt

A report by Jerusalem-based investigative journalists published in +972 magazine finds that AI targeting systems have played a key role in identifying – and potentially misidentifying – tens of thousands of targets in Gaza. This suggests that autonomous warfare is no longer a future scenario. It is already here, and the consequences are horrifying.

There are two technologies in question. The first, “Lavender”, is an AI recommendation system designed to use algorithms to identify Hamas operatives as targets. The second, the grotesquely named “Where’s Daddy?”, is a system that tracks targets geographically so that they can be followed into their family residences before being attacked. Together, these two systems automate the find-fix-track-target components of what modern militaries call the “kill chain”.

Such systems are trained on a set of data to produce the profile of a Hamas operative. This could be data about gender, age, appearance, movement patterns, social network relationships, accessories, and other “relevant features”. They then work to match actual Palestinians to this profile by degree of fit. The criteria for what counts as a relevant feature of a target can be set as stringently or as loosely as desired. In the case of Lavender, it seems one of the key equations was “male equals militant”. This echoes the infamous “all military-aged males are potential targets” mandate of the 2010s US drone wars, in which the Obama administration identified and assassinated hundreds of people designated as enemies “based on metadata”.
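To make the statistical point concrete, below is a toy sketch of why matching people to a profile “by degree of fit” misidentifies at scale. Everything in it is an illustrative assumption (the population size, the base rate, the score distributions, the thresholds); it models no real system, only the generic base-rate arithmetic that any classifier of this kind runs into.

```python
# Toy illustration with purely synthetic data: how a profile-match
# threshold trades missed matches against false positives. All numbers
# below are invented for demonstration and describe no real system.
import random

random.seed(0)

POPULATION = 100_000   # hypothetical population size (assumption)
BASE_RATE = 0.001      # assumed fraction who actually fit the profile

def match_score(is_true_match: bool) -> float:
    """Noisy 'degree of fit' score: true matches score higher on average,
    but the distributions overlap, so no threshold separates them cleanly."""
    mean = 0.75 if is_true_match else 0.45
    return min(1.0, max(0.0, random.gauss(mean, 0.15)))

people = [random.random() < BASE_RATE for _ in range(POPULATION)]
scores = [match_score(p) for p in people]

for threshold in (0.9, 0.7, 0.5):  # stringent -> loose
    flagged = [is_match for is_match, s in zip(people, scores) if s >= threshold]
    true_pos = sum(flagged)
    false_pos = len(flagged) - true_pos
    print(f"threshold {threshold:.1f}: flagged {len(flagged):6d}, "
          f"of whom {false_pos:6d} are misidentified "
          f"({false_pos / max(len(flagged), 1):.0%} false)")
```

Even the stringent threshold flags mostly non-matching people in this toy run, because when the profile’s true base rate is tiny, false positives from the huge non-matching population swamp the handful of true matches. Loosening the threshold, as the article describes, only amplifies this.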


Comments


righttoprivacy wrote

The testing ground for so much of this dystopian tech is Gaza. Next, it gets lobbied to a police dept near you (big Israeli industry).
