
AI-Facilitated Facial Identification: Tracking Palestinians with Red Wolf, Blue Wolf Technology

In the occupied Palestinian territories, cutting-edge AI systems serving as gatekeepers and surveillance tools have become an ever-present reality. In the bustling streets of Hebron, at overcrowded checkpoints in East Jerusalem, and in the daily routines of millions, Red Wolf and its companion system, Blue Wolf, shape who may pass and who is stopped.


In the occupied Palestinian territories, two AI-powered facial recognition systems, Red Wolf and Blue Wolf, are being used by Israel at checkpoints, airports, bus terminals, and other public places. These systems, which map 80 specific nodes of a person's face to create detailed biometric profiles, are raising significant concerns about privacy, human rights, and the potential for mission creep.

The facial data collected through these systems and related apps are combined with personal information to build extensive maps of individuals' contacts and social networks, particularly those of Gaza activists. AI programs such as Lavender and Where’s Daddy then assign a risk rating, or "score," to individuals based on their contacts and behavior.

The implications of this surveillance are severe. The Israel Defense Forces (IDF) have divided Gaza into 620 blocks, effectively restricting movement and monitoring activists within these zones. Over 37,000 activists and their networks have been mapped, leading to targeted nighttime bombings of homes where activists were present, often alongside their families.

Red Wolf technology enables Israeli soldiers to identify Palestinians at checkpoints before they even present ID cards, creating a draconian surveillance environment. The biometric data fuels military decision-making processes, leading to restrictions of movement, targeted operations, and potential violations of privacy and human rights.

Blue Wolf is a mobile application carried by soldiers on patrol, which cross-checks a Palestinian's face against a large biometric repository known as Wolf Pack. People cannot see, contest, or correct the data that governs them, and there is no independent oversight, no appeals process, and no defined retention or sharing rules for biometric data.

The system drifts toward maximizing friction for the population it governs: its outputs instantly gate access and trigger enforcement. Residents of Hebron describe checkpoints that feel like automated gates, where a red screen can lock someone out of their own street until a human override arrives.

The training and reference data for these systems overwhelmingly comprise Palestinian faces, concentrating model performance on one group and codifying a form of digital profiling by design. The fusion of military occupation and AI-gated movement is unusually stark, demonstrating how modern computer vision can harden systems of segregation.

In East Jerusalem's Old City and surrounding neighborhoods, an expansive CCTV backbone supports facial recognition. Authorities have layered AI-capable CCTV across Palestinian neighborhoods and around holy sites, enabling post-event arrests by running video through face search. The systems collect data without consent and embed algorithms into the machinery of occupation.

The choice facing the wider world is whether to accept this template for governing through algorithms or to draw a hard line before automated suspicion becomes the default setting of public life. It is crucial to address these concerns and ensure that the use of AI in such sensitive contexts is transparent, accountable, and respectful of human rights.


