Killer AI or killer states? The use of AI in surveillance and genocide in Palestine by Israeli authorities and its global context
23. August 2024
The text is based on the discussion from the panel that we organised at Festival Grounded in Ljubljana, August 2024. We thank the panellists for their contributions to the discussions and this summary. Because of privacy and security concerns we have decided not to publish their names.
Palestinian city under Israeli surveillance cameras. Generated with Adobe Firefly.
Israel has for decades subjected Palestinians to invasive surveillance that infringes on their fundamental rights. Unfortunately, with the recent progress of artificial intelligence (AI) technologies, driven mainly by more powerful computers, cloud services and enormous amounts of data, the technologies of control and repression are advancing as well. We can see this clearly manifested in the Israeli attacks on Palestine, both in Gaza and in the West Bank.
Israel is using a multitude of AI-related tools to support apartheid and to surveil and kill. Autonomous drones and automated machine guns at border checkpoints have been used in the past, as have extensive AI-based biometric surveillance systems, particularly in the West Bank. Israel’s technological, militaristic and economic supremacy determines how AI-related technologies are integrated into Palestinian life, posing critical threats to Palestinian human rights.
Travelling around the West Bank as a Palestinian is like moving around an open-air panopticon. Palestinians are subjected to a permit system run by the Israeli occupation authorities that is partially operated through the Al-Munasiq app. Apps of this kind collect information on Palestinians, both directly related to the permit and much broader, and share it with Israeli security services. At the same time, Israel controls the population registry and issues identity cards to Palestinians living in the Occupied Palestinian Territories in the West Bank, Gaza and East Jerusalem, cards that determine one’s degree of freedom.
Palestinians have to pass Israeli checkpoints on a daily basis just to move from one Palestinian town to another. These checkpoints were previously more “analogue”, but they are becoming ever more dependent on various technologies. While a Palestinian passes through a checkpoint, IDF soldiers take pictures of their car, number plate and face, and ask questions that must be answered before they are allowed through. All the data and information gathered is entered, via a mobile phone app called Blue Wolf, a homegrown facial recognition system, into a vast database called Wolf Pack, which contains all available information on Palestinians, including where they live, who their family members are, and whether they are wanted for questioning by Israeli authorities. At the same time, Blue Wolf is used to instantly pull up the information stored in the Wolf Pack database. To encourage IDF soldiers to collect more data on more Palestinians, use of the Blue Wolf app is even gamified: soldiers who collect the most data can gain extra work perks, like more off-duty hours.
Temple Mount Israeli checkpoint. Wikipedia Commons.
At checkpoints in West Bank cities and in East Jerusalem, a more automated data-gathering system is in operation. Hebron, for example, was transformed by a Smart City project that riddled it with AI-run 360-degree cameras and automated checkpoints equipped with AI-powered weapons. A system named Red Wolf scans Palestinians with high-resolution cameras, without their consent, and compares the images with biometric entries in Wolf Pack before giving them permission to pass. If no entry exists for an individual, they are denied passage. Red Wolf also denies passage based on other information stored on Palestinians, for example if a person is wanted for questioning or arrest. The feeling of being constantly watched is thus all too real. Even as the world’s governments try to persuade us that if you have nothing to hide, you have nothing to fear, implying that AI-based surveillance and warfare technologies won’t affect you, the daily experience of Palestinians should teach you otherwise.
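Reduced to its reported essentials, the checkpoint decision is a blunt automated rule: no biometric entry in the database means no passage, and a flag in the database also means no passage. The following is a minimal illustrative sketch of that reported logic only; every function and field name is invented, since the real system’s internals are not public.

```python
# Illustrative sketch of the checkpoint decision rule as described in
# public reporting. All identifiers here are hypothetical inventions;
# this is not, and does not resemble, the actual proprietary system.

def checkpoint_decision(face_match, record):
    """Return 'pass' or 'deny' following the reported rule set."""
    if face_match is None or record is None:
        return "deny"  # no biometric entry exists -> passage denied
    if record.get("wanted_for_questioning") or record.get("wanted_for_arrest"):
        return "deny"  # flagged in the database -> passage denied
    return "pass"

# Hypothetical examples: an unflagged record passes; a person with no
# database entry at all is denied by default.
print(checkpoint_decision("match-123", {"wanted_for_questioning": False}))
print(checkpoint_decision(None, None))
```

Note what the default in this sketch implies: absence from the database is itself grounds for denial, so the system punishes not being surveilled.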
Gaza had not been subjected to equally extensive techno-surveillance before the current assault, but Israel quickly took advantage of the situation to collect more data with which to feed AI systems based on predictive analytics. There are three such infamous systems that we currently know of: The Gospel, which generates infrastructure targets; Lavender, which generates human targets; and Where’s Daddy?, which identifies those targets when they are at home and marks them for attack. The way these tools work is very telling of Israel’s modus operandi. Lavender, for example, takes information collected through a comprehensive system of mass surveillance (CCTV cameras, biometric checkpoints, databases, and even Google Photos used for facial recognition). Once extracted, this information is analysed against certain characteristics to assign each Gazan a score from 1 to 100. Life or death is thus decided by data such as gender, address, age, family ties and membership in WhatsApp groups.
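The statistical danger of scoring an entire population on crude proxy features can be made concrete with a toy base-rate calculation. All numbers below are invented assumptions for illustration, not figures about the real system: the point is simply that even a seemingly accurate scorer, applied to millions of people, wrongly flags vastly more people than it correctly identifies.

```python
# Toy base-rate illustration (all numbers are invented assumptions):
# a scoring system with assumed 90% sensitivity and a 10% false-positive
# rate, applied to a whole population where the targeted category is rare,
# flags far more innocent people than actual targets.

population = 2_300_000      # roughly Gaza's population, for scale
base_rate = 0.001           # assume 0.1% actually belong to the targeted category
true_positive_rate = 0.90   # assumed sensitivity of the hypothetical scorer
false_positive_rate = 0.10  # assume 10% of everyone else is wrongly flagged

actual = population * base_rate
true_positives = actual * true_positive_rate
false_positives = (population - actual) * false_positive_rate

print(f"correctly flagged: {true_positives:,.0f}")
print(f"wrongly flagged:   {false_positives:,.0f}")
```

Under these assumed numbers, the wrongly flagged outnumber the correctly flagged by more than a hundred to one, which is why treating such scores as grounds for lethal action is statistically indefensible even before any ethical argument is made.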
Destruction in Gaza as seen via a computer vision algorithm. Danes je nov dan.
Through developing, testing and selling tech-supported military equipment, Israel has transformed what was historically a political problem and a source of shame (colonialism, occupation, repression of peoples) into something marketed as a great model of population control and internal security. Israel promotes itself as a startup nation producing urban warfare tools for the ever-expanding fields of counterterrorism and securitisation. Many governments are willing to buy into this idea and use Israeli-produced tech that is marketed as battle-tested. Israel has tapped into a market opportunity, but its export success is inextricably linked to military, economic and political support from the US.
While the brutality of Israel’s use of technology for malicious purposes may be shocking, it is compelling for many other nation states. The aim of the Israeli military-industrial complex, which consists of the state and private companies, is not solely to kill indiscriminately, but also to use occupied Palestine as a testing ground for weapons and surveillance technologies that are later sold on the global market to other countries, authoritarian and democratic alike.
Technologies used in surveillance and warfare are deployed to minimise personal contact between the Israeli military and the Palestinian population, not to inflict less damage, but to protect soldiers from Palestinian resistance and to protect the reputation of the Israeli occupation under the guise of objectivity. AI accelerates the speed of warfare in terms of the number of targets produced and the time taken to decide on them. While these systems inherently decrease the ability of humans to verify the validity of computer-generated targets, they simultaneously make these decisions appear more objective and statistically sound, because of the value we generally ascribe to computer-based systems and their outputs. So-called AI washing, whereby industry and governments proclaim that AI will solve some of the most complex social, economic and political issues, thereby absolving themselves of any responsibility for the horrifying consequences that the use of AI tech has on people’s lives, is blatantly obvious in the case of Israel.
The use of AI-powered weapons described above carries severe potential for violating human rights as codified in the International Covenant on Civil and Political Rights. The indiscriminate attacks we have seen in Gaza, executed with the use of The Gospel, Lavender and Where’s Daddy?, point to violations of various rights and, according to the International Court of Justice, together with several other factors, such as the genocidal rhetoric, forced starvation and the complete siege of Gaza, even indicate the plausibility of the crime of genocide, the ultimate violation of human rights.
AI-assisted warfare tools, and checkpoints with AI surveillance systems and autonomous guns that can track people and fire tear gas, stun grenades and sponge-tipped bullets, set dangerous precedents for how AI-related technologies can harm civilian populations. They warrant an emergency response from the global community to safeguard internationally protected human rights and the other rules of international law aimed at protecting civilians. Placing a human being at the centre of any choice to take a life is one of the few emerging international standards around AI in conflict: whether or not to kill a living, breathing human should not be decided by machines or algorithms. Because of Israel’s application of AI in Gaza, however, this norm is being undermined before it has had a chance to fully establish itself. If the international community does not act, the same creeping normalisation that occurred with the largely unquestioned use of new satellite technology in the Gulf Wars will happen to AI-powered warfare tools.
Notwithstanding all that was written above, we can’t overlook the broader context. What Israel is doing is not happening because of technology, but because technology is being used by a country that has dehumanised Palestinians for decades. Artificial intelligence is just the latest in a line of tools that serve the same genocidal goal.
Recommended reading list
- 7amleh Violence Indicator
- 7amleh: you can find reports, news and updates + you can sign up to their newsletter to stay in the loop!
- +972 and Local call: ‘Lavender’: The AI machine directing Israel’s bombing spree in Gaza
- +972 and Local call: ‘A mass assassination factory’: Inside Israel’s calculated bombing of Gaza
- Access Now: Artificial Genocidal Intelligence: how Israel is automating human rights abuses and war crimes
- Amnesty International: Israel/OPT: Israeli authorities are using facial recognition technology to entrench apartheid
- Amnesty International: Automated Apartheid: How facial recognition fragments, segregates and controls Palestinians in the OPT
- Cambridge/International Journal of Middle East Studies: Algorithmic State Violence: Automated Surveillance and Palestinian Dispossession in Hebron’s Old City
- The Dialogue Box: Red Wolf and the Surveillance State: Investigating the Human Rights Implications of AI-Powered Facial Recognition Technology in Palestine
- Euro News: Israel deploys AI-powered robot guns that can track targets in the West Bank
- The Guardian: How Israel uses facial-recognition systems in Gaza and beyond
- Khelil, Khaldoun, Center for International Policy: AI and Israel’s Dystopian Promise of War without Responsibility
- Lowenstein, Antony, The Palestine Laboratory: How Israel Exports the Technology of Occupation Around the World
- Middle East Eye: Wolf Pack: Israel’s accelerated use of facial recognition is ‘automated apartheid’
- Mint Press: Israel Experiments on Palestinians With AI-Powered Gun at Checkpoints
- New Arab: Israel steps up its dehumanisation of Palestinians with new biometric checkpoints in the West Bank
- NY Times: Israel Deploys Expansive Facial Recognition Program in Gaza
- Turning Point Mag: From Lasers to Lavender: Will Israel’s Dual-Use Technology Lead To Dual-Use Societies?
Worth your attention
Invitations, tips, suggestions, and warnings
It is no coincidence that most world leaders stand firmly behind Israel and support its crimes. Behind the public statements of politicians, who in no way condemn or blame Israeli policy but, as if we were dealing with a natural disaster, are merely shocked and saddened by the destruction of Gaza, lie strategic and powerful ties to the occupying power. The West is by no means a mute witness to the massacre and the theft of land; through its policy of economic and military cooperation, it is an active partner in the occupation.
28. November 2014
The Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (the Artificial Intelligence Act) must protect all people equally against the dangers of discriminatory AI systems, including those applying for a visa or residence permit, seeking asylum, or living with irregular migrant status.
15. September 2023