Bureau de Crise
(last edited: 2022)
author(s): worker 0001
(2019-ongoing)
Privacy-focused digital research platform and network.
Bureau de Crise is a research network of artists, engineers, designers, and concerned citizens who map privacy issues in a digital society governed by the opaque laws of surveillance capitalism. The network aims to cultivate a sense of autonomy grounded in aspirations for both individual and collective freedom, and to act collectively toward self-empowerment.
Bureau de Crise intends to contribute to a moral center of critical thinking and autonomous action. Since privacy and security have become primarily a luxury reserved for a more educated elite, we want to make privacy accessible and visible to everyone, in order to end the vast asymmetries of knowledge institutionalized by private surveillance capital. The operative systems of surveillance capitalism constitute a direct assault on human rights (privacy), human autonomy, our decision rights, and the notion of individual sovereignty. ‘Chilling effects’ (i.e., behavioral changes), induced by strong social norms that lead to self-policing and self-censorship, directly affect our subjectivities and our democracies.
We claim the right to human invisibility and collaborate in the art and science of hiding. Bureau de Crise thus aims to help create a culture of digital privacy by hacking both the general lack of knowledge and the dominant propaganda of technological ‘inevitability’.
Furthermore, our artistic practice aims to empower people to protect their data from the dark mechanisms of data and behavior commodification. We therefore need to invent new kinds of cultural institutions, shaped by critical communities working on strategies of resistance against these architectures of oppression: techniques of cognitive and psychological capture secretly designed to alienate remote-controlled, psycho-civilized populations and to profit from them.
Finally, we aim to address the need for new regulations, legal safeguards, and ethical limits to interrupt these specific mechanisms and limit data collection and misuse.
Patterns-of-Life
(last edited: 2024)
author(s): Revuelta
In 2017, the U.S. Department of Defense (DoD) announced the launch of Project Maven, also known as the Algorithmic Warfare Cross-Functional Team (AWCFT). This program aims to integrate artificial intelligence (AI) and machine learning into military operations to maintain a strategic advantage over increasingly sophisticated adversaries. The primary objective of Maven is to develop an automated analysis system capable of identifying targets and spotting suspicious activities through machine learning algorithms and advanced computer vision techniques.
One year after its launch, the DoD revealed that the program was using an AI algorithm to autonomously recognize and identify targets, marking a significant advancement towards achieving one of Maven's objectives: to develop and integrate computer vision algorithms to assist military analysts overwhelmed by a constant flow of video data.
Inspired by Project Maven, “Patterns-of-Life” offers an immersion into the perspective of a military drone, embodying the algorithmic eye of war. Trained on real combat footage collected from social networks and YouTube, the work presents a pattern-recognition system that predicts human behavior from movements and postures, assessing a target’s hostile or neutral potential. It thus explores the concept of “lethal autonomous targeting” by an AI system capable of differentiating between combatants and civilians, set in an urban context where the technology is revealed in everyday scenes.
“Patterns-of-Life” emphasizes the fundamental importance, enshrined in the Geneva Conventions, of distinguishing between civilians and combatants. A mission that was once the responsibility of commanders and soldiers now relies on software analyzing data collected by sensors.
“Patterns-of-Life” invites us to reflect on the role we want technology to play in resolving conflicts and preserving innocent lives. It leads us to question the future of increasingly automated warfare and the issues of responsibility that arise. It asks: will life-or-death decisions be entirely delegated to machines? How will we recognize this shift?