“Patterns-of-Life”
(2023-2024)


Thermal Imaging and Autonomous Targeting Systems

Preliminary Research Phase for the 'Kill Cloud Room' Project


In 2017, the U.S. Department of Defense announced the launch of Project Maven, also known as the Algorithmic Warfare Cross-Functional Team (AWCFT). This program aims to integrate artificial intelligence (AI) and machine learning into military operations to maintain a strategic advantage over increasingly sophisticated adversaries. The primary objective of Maven is to develop an automated analysis system capable of identifying targets and spotting suspicious activities through machine learning algorithms and advanced computer vision techniques. 


One year after its launch, the DoD revealed that the program was using an AI algorithm to autonomously recognize and identify targets, marking a significant advancement towards achieving one of Maven's objectives: to develop and integrate computer vision algorithms to assist military analysts overwhelmed by a constant flow of video data.


Inspired by the Maven project, “Patterns-of-Life” offers an immersion into the perspective of a military drone, embodying the algorithmic eye of war. Trained on real combat footage collected from social networks and YouTube, the work presents a pattern-recognition system that predicts human behavior from movements and postures, assessing whether a target is hostile or neutral. It thus explores the concept of "lethal autonomous targeting" by an AI system capable of distinguishing combatant actions from civilian ones, presented in an urban context where the technology is revealed in everyday scenes.


“Patterns-of-Life” emphasizes the fundamental importance, stated in the Geneva Conventions, of distinguishing between civilians and combatants. A mission that was once the responsibility of commanders and soldiers now relies on software analyzing data collected by sensors.


“Patterns-of-Life” invites us to reflect on the role we want technology to play in resolving conflicts and preserving innocent lives. It leads us to question the future of increasingly automated warfare and the issues of responsibility that arise. It asks: will life-or-death decisions be entirely delegated to machines? How will we recognize this shift?


Technologies used:

Thermal Imagery:
The installation stages an autonomous targeting system that detects individual combatant actions and movements and marks them as targets within an urban setting. Even as part of an artistic installation, it is impossible to solicit the consent of participants for the use of their images.


Given the complexity of ethical issues, particularly concerning privacy and anonymization, we have chosen thermal or infrared imaging instead of optical images to capture these scenarios in the drone installation. This approach allows us to represent targets in an impersonal manner, making them less recognizable, while staying as close to military realities as possible.


This strategy aims to meet the anonymization requirements of individuals involved in the physical and virtual installation, with the goal of preserving their privacy, while highlighting the issue of dehumanizing targets in the context of drone usage in times of war.

Custom Autonomous Targeting System:
This system is built upon advanced human pose estimation algorithms and a Human Activity Recognition (HAR) system capable of automatically recognizing and categorizing human actions in videos. It includes:

  • Human Detection Algorithm (YOLO): trained on a custom dataset to accurately detect humans in thermal images and videos.

  • Human Action Recognition model, trained on our custom dataset: capable of differentiating and analyzing the actions of combatants and civilians.


  • Tracking Algorithms: specially tailored for the thermal camera mounted on a gimbal, enhancing precision in tracking potential targets.
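The per-frame flow of such a pipeline (detection, pose estimation, then action classification) can be sketched as follows. This is a minimal illustrative sketch, not the installation's actual models: the `Detection` structure, the joint names, and the single keypoint rule standing in for the HAR network are all assumptions made for clarity.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

# Minimal sketch of the per-frame analysis pipeline described above.
# A real system would run a YOLO detector and a pose estimator on each
# thermal frame; here a Detection is assumed to already carry keypoints.

@dataclass
class Detection:
    box: Tuple[int, int, int, int]        # x, y, width, height in pixels
    keypoints: Dict[str, Tuple[int, int]]  # joint name -> (x, y)

def classify_action(det: Detection) -> str:
    """Toy stand-in for the HAR model.

    A trained action-recognition network would consume a sequence of
    poses; this single-frame rule only illustrates where that inference
    step sits in the pipeline.
    """
    head_y = det.keypoints["head"][1]
    wrist_y = min(det.keypoints["left_wrist"][1],
                  det.keypoints["right_wrist"][1])
    # Image y grows downward: a wrist above the head reads as "arms raised".
    return "arms_raised" if wrist_y < head_y else "neutral"

def analyse_frame(detections: List[Detection]) -> List[str]:
    """Run the (stubbed) action classifier on every detected person."""
    return [classify_action(d) for d in detections]

person = Detection(
    box=(120, 80, 40, 110),
    keypoints={"head": (140, 85),
               "left_wrist": (125, 70),
               "right_wrist": (158, 150)},
)
print(analyse_frame([person]))  # ['arms_raised']
```

In the actual installation the rule-based classifier above would be replaced by the trained HAR model, and the resulting labels would feed the gimbal-tracking stage.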


In collaboration with Laurent Weingart.

© 2024 revuelta.ch