Many civil rights organizations and research institutions have raised questions about the efficacy and fairness of these systems’ reliance on historical data. AI that is based primarily on arrest data carries a higher risk of bias, since it reflects police decisions more than actual crimes. It is, in short, potentially dirty data. Critics also point to the dangers of a “feedback loop”: the results of the algorithms both reflect and reinforce attitudes about which neighborhoods are “bad” and which are “good.” Predictions tend to turn into self-fulfilling prophecies; if the findings raise the expectation of crime in certain neighborhoods, police are more likely to patrol those areas and make arrests there. Furthermore, the lack of transparency and public accountability that comes with such proprietary software prevents proper analysis and understanding of algorithmic recommendations. The artist and researcher Mimi Onuoha termed this “algorithmic violence”: invisible, automated decision-making processes that “affect the ways and degrees to which people are able to live their everyday lives” and that legitimize hierarchy and inequality. Provoking viewers to recognize their (un)witting complicity in these systemic processes, Thompson’s narration concludes with a call to action: “The red square has also been a place of revolution. We decide which we will become: Prisoners or Revolutionaries. Democracy is fragile.”
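The feedback loop critics describe can be made concrete with a toy simulation, a minimal sketch under assumed numbers rather than PredPol’s actual model: two neighborhoods with identical underlying offense rates, where patrols are allocated in proportion to past recorded arrests.

```python
# Illustrative sketch only -- a toy model of the feedback loop described above,
# not PredPol's actual algorithm. All numbers are hypothetical.
import random

random.seed(0)

TRUE_OFFENSE_RATE = {"A": 0.10, "B": 0.10}  # both neighborhoods offend at the same rate
recorded_arrests = {"A": 60, "B": 40}       # but the historical record over-represents A
TOTAL_PATROLS = 100

for year in range(10):
    total_recorded = sum(recorded_arrests.values())
    for hood, rate in TRUE_OFFENSE_RATE.items():
        # Patrols are allocated in proportion to past recorded arrests ...
        patrols = round(TOTAL_PATROLS * recorded_arrests[hood] / total_recorded)
        # ... and more patrols produce more recorded arrests, even though the
        # underlying offense rates are identical: the prediction fulfills itself.
        recorded_arrests[hood] += sum(random.random() < rate for _ in range(patrols))

print(recorded_arrests)  # the initial 60/40 skew in the data persists year after year
```

Because arrests can only be recorded where officers are sent, the neighborhood that starts with more recorded arrests keeps receiving more patrols, and the skew in the data reproduces itself indefinitely even though the underlying rates never differ.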
The third part of the installation forces the viewer’s hand: a web portal based on the same algorithm as PredPol predicts the likelihood of white-collar crimes according to zip code. The implication is clear: data identity is shaped not by algorithms but by a multitude of social and political forces, both visible and hidden, that we must continue to battle. “Art,” Hershman Leeson once said, “is a form of encryption.” Shadow Stalker challenges viewers to take control of their rapidly forming digital identities, before it is too late.
Text by Claudia Schmuckli, Curator in Charge of Contemporary Art and Programming, Fine Arts Museums of San Francisco; from Beyond the Uncanny Valley: Being Human in the Age of AI, Fine Arts Museums of San Francisco. Available for purchase through the Museum Stores.
Further Reading
- Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (New York: Public Affairs, 2019).
- Rashida Richardson, Jason M. Schultz, and Kate Crawford, “Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice,” New York University Law Review 94 (May 2019): 192–233, https://www.nyulawreview.org/wp-content/uploads/2019/04/NYULawReview-94-....
- Mimi Onuoha, “Notes on Algorithmic Violence,” GitHub, last updated February 22, 2018, https://github.com/MimiOnuoha/On-Algorithmic-Violence.
- “Art Is a Form of Encryption: Laura Poitras in Conversation with Lynn Hershman Leeson,” PEN America, August 23, 2016, https://pen.org/art-is-a-form-of-encryption-laura-poitras-in-conversatio....