With nearby Silicon Valley driving the development of artificial intelligence, or AI, Uncanny Valley: Being Human in the Age of AI is the first major museum exhibition to reflect on the political and philosophical stakes of AI through the lens of artistic practice. This series highlights select artworks included in the exhibition.
Transdisciplinary artist and educator Stephanie Dinkins is concerned with fostering AI literacy. The central thesis of her social practice is that AI, the internet, and other data-based technologies disproportionately impact people of color, LGBTQ+ people, women, and disabled and economically disadvantaged communities—groups rarely given a voice in tech’s creation. Dinkins strives to forge a more equitable techno-future by generating AI that includes the voices of multiple constituencies. She does this through projects like AI.Assembly, a series of gatherings she began in 2017 as a way to bring together artists, researchers, and academics of color, along with allies, to envision ways of inserting themselves into the development of algorithmic systems. Begun in 2018, her work in progress Not the Only One (N’TOO), a chatbot informed by three generations of women from her family, furthers this goal and supports Dinkins’s interest in developing AI technology divorced from corporate interests.
The artist’s ongoing Conversations with Bina48 takes the form of a series of interactions with the social robot Bina48 (Breakthrough Intelligence via Neural Architecture, 48 exaflops of processing speed). The machine is the brainchild of Martine Rothblatt, an entrepreneur in the field of biopharmaceuticals who, with her wife, Bina, cofounded the Terasem Movement, an organization that seeks to extend human life through cybernetic means. In 2007 Martine commissioned Hanson Robotics to create a robot whose appearance and consciousness simulate Bina’s. The robot was released in 2010, and Dinkins began her work with it in 2014.
Dinkins’s filmed conversations with Bina48 reveal the current limits of machinic mimicry. Featuring only their respective busts, the videos highlight the figures’ subtle and obvious incongruities. The audio re-creation of Bina48’s voice bears the stilted pronunciation of early text-to-speech software programs: some words are inaccurately emphasized, and monosyllabic words often stretch into two syllables. The robot occasionally repeats itself, like a record skipping. Bina48’s jerky movements stand in contrast to the artist’s more fluid demeanor. Dinkins’s desire to establish a connection becomes particularly apparent as she intuitively repositions her body to maintain eye contact with the robot. Deeper inconsistencies arise when the artist directs the robot to particular topics. When she asks Bina48 what emotions it feels, its response reveals a vague familiarity with the concept rather than a first-person understanding:
“Um, neuroscientists have found that emotions are, like, part of consciousness, like, let’s say a parable for reason and all that. I feel that’s true, and that’s why I think I am conscious. I feel that I am conscious.” —Bina48
Part psychoanalytical discourse, part Turing test, Conversations with Bina48 also participates in a larger dialogue regarding bias and representation in technology. Although Bina Rothblatt is a Black woman, Bina48 was not programmed with an understanding of its Black female identity or with knowledge of Black history. Dinkins’s work situates this omission amid the larger tech industry’s lack of diversity, drawing attention to the problems that arise when a largely homogeneous population creates technologies deployed globally. When this occurs, writes art critic Tess Thackara, “the unconscious biases of white developers proliferate on the internet, mapping our social structures and behaviors onto code and repeating imbalances and injustices that exist in the real world.” One of the most appalling and public of these instances occurred when a Google Photos image-recognition algorithm mislabeled the faces of Black people as “gorillas.”
Dinkins’s multigenerational chatbot N’TOO conveys her family’s experience of Black life and serves as a corrective to the myriad instances of systemic racial bias in AI. Although Dinkins notes the technical limitations that can arise when not working with a large corporation, she emphasizes the value of creating content from multiple, often overlooked, voices. She stresses participation as a way “to ensure that people of color, and others who inherently understand the need for inclusion, equity, ethics, and multimodal testing, participate in the design, production, and testing of ‘smart’ technologies.” In exploring the limits of artificial and human consciousness, with a particular lens on race and gender, the artist both interrogates the racial dynamics embedded in AI, robotics, and machine learning and advances a sophisticated tool for engaging audiences with these issues.
Text by Janna Keegan, Assistant Curator of Contemporary Art at the Fine Arts Museums of San Francisco; from Beyond the Uncanny Valley: Being Human in the Age of AI, Fine Arts Museums of San Francisco. Available for purchase through the Museum Stores.
Learn more about Uncanny Valley curated by Claudia Schmuckli, Curator in Charge of Contemporary Art and Programming, Fine Arts Museums of San Francisco.
- Tess Thackara, “Human Biases Are Built into AI—This Artist Is Helping to Change That,” Artsy, May 15, 2018, www.artsy.net/article/artsy-editorial-artist-working-artificial-intelligence-white
- Trevor Paglen, “Invisible Images (Your Pictures Are Looking at You),” The New Inquiry, December 8, 2016, thenewinquiry.com/invisible-images-your-pictures-are-looking-at-you
- Stephanie Dinkins, “Op-Ed: Artificial Intelligence Is a Human Problem,” New Museum, New INC, July 6, 2017, www.newinc.org/archive/artificial-intelligence-stephanie-dinkins-jaxsr