Face First: Researchers Gone Wild
27.08. - 19.09.2020
Opening - 27.08.2020, 11 am - 8 pm
“In response to this research project, 3 image training datasets were terminated, 2 citations were censored, 1 author apologized, and face recognition training datasets became a front page story” – This is how the artists Adam Harvey and Jules LaPlace summarize the early results of their collaborative art and research project “MegaPixels”. Harvey and LaPlace examine different facial recognition databases, publish them on their website, and explore their ethically dubious nature. To this end, they investigate the included pictures, the motivations behind their inclusion, as well as the financial backing of the data collection efforts.
Selfies, profile pictures, pictures of celebrities, YouTube tutorials, and video snapshots, as well as authentic, “in the wild”¹ recordings, are used in building and expanding the picture libraries and datasets. Where and how these datasets are used, who can access them, and why, often remains unexplored and barely controlled. Among other uses, neural networks are trained on these datasets to improve facial recognition capabilities. The greater the quantity of pictures, the variety of lighting conditions, and the diversity of facial expressions, the more adept algorithms become at evaluating biometric information.
In fall 2019, The New York Times commented on the work under the title “Facial Recognition Tech Is Growing Stronger, Thanks to Your Face”. The Financial Times asked “Who’s Using Your Face?” in reaction to the work and cited the research in the article.
Adam Harvey’s new works manifest as a follow-up on these themes in physical space. In his first solo exhibition “Face First: Researchers Gone Wild” at EIGEN + ART Lab in Berlin, Harvey is showing an installation of works that provide context and insights into his years-long research within the “MegaPixels” project. The exhibition connects to the artist’s previous works dealing with themes of computer vision, image recognition, privacy, and surveillance.
What is the future of our pictures? Who do our pictures and their underlying data belong to? Who is allowed to use and archive our likeness? Harvey’s works show the unexpected use of our photos in training datasets and biometric research projects by academic, corporate, security, and military agencies – whether legally or illegally – and how they contribute to the growing crises of unregulated biometric surveillance technologies.
At first glance, the documentary series “Datageist: Duke MTMC” and “Datageist: Brainwash” appear to be nothing more than images recorded by surveillance cameras on the campuses of Stanford and Duke Universities. Surveillance recordings are not subject to any formal rules of composition and are not meant to be aesthetically pleasing. Harvey overlays a heat map onto these recordings, enabling the viewer to recognize areas within the images where individuals or groups of people congregated. Harvey shows that, while single images in surveillance recordings seem to disappear in a stream of images, and the universities have removed the accompanying datasets, the images still circulate as ‘ghosts’ and have been archived numerous times.
“Research Cameos”, an installation on eight framed LED screens, shows the names of celebrities and public figures whose likenesses have been used in facial recognition training databases.² The word “cameo” refers to a small character part in a play or film, played by a distinguished actor or a celebrity. The pictures follow a strict script: the name of a celebrity or public figure appears for a fleeting moment before disappearing at the edge of the screen. At the same speed, pictures of celebrities and public figures appear on the internet. These pictures are especially suited to training facial recognition algorithms – not least because the status of public figure seems to legitimize the free use of the picture. Whoever is in the picture is playing a bit part. The abstracted image data alone serves as breeding ground and fodder for copious databases.
By uploading and circulating pictures, publishing data, and using search engines, we become unwitting, unwilling, and passive participants in our own surveillance and data analysis. In the age of facial recognition, every selfie can be used to build a more rigorous biometric profile. That this data can be used to perfect facial recognition technology is plausible, but we may never know.
“If you knew that your image, your friend’s image, or your child’s image was being used for developing products in the defense industry, would you object?” (Adam Harvey)
¹ One of the first facial recognition databases was created from studio photographs in the 1990s. However, scientists realized that these well-exposed, posed photos could hardly be used to train algorithms for face recognition. Researchers then pivoted to images “in the wild”: pictures taken in natural, diverse situations. The first comprehensive image database of this kind was created in 2007 under the title “Labeled Faces in the Wild” and drew its images from various news sites on the Internet. The images showed people in front of different backgrounds, under different lighting conditions, in different poses, and from different perspectives.
² Microsoft Celeb (MS-Celeb-1M) is a database of 10 million facial images collected from the Internet for the development of facial recognition technologies. While the majority of the people in this dataset are American and British actors, many of the names in the MS Celeb facial recognition dataset are simply people who maintain an online presence for their professional lives: journalists, artists, musicians, activists, policymakers, writers, and academics: https://megapixels.cc/msceleb/
Photos: Eike Walkenhorst