Once a bustling shipbuilding hub during the 1940s, San Francisco’s Pier 70 is now a site of dilapidated, inaccessible warehouses — what some consider a symbol of obsolescence neighboring America’s innovation capital, Silicon Valley. Appropriately, the artist Trevor Paglen — whose work examines ethical issues within newer technologies: mass surveillance, artificial intelligence, and drone warfare, among them — was commissioned to create a new performance at one of the pier’s warehouses. The location, the audience (a mix of creative and tech elites), the timing during an anticipated art week in the city, and the subject matter of how machines “see” musical performance created a perfect platform to analyze the present and future role of machines within society.
Made in collaboration with creative studio Obscura Digital and the Grammy-winning Kronos Quartet, Sight Machine pairs manufacturing, surveillance, and face recognition technology with classical, American blues, and African folk music. The early part of the performance is whimsical and humorous; a screen above the quartet displays an array of readings, from colorful shapes clustering around the instruments to false, captioned interpretations about the performers themselves. (One caption reads, “Sunny is 72.72% female”; another says, “Sunny is holding a cell phone,” when she is really holding a cello.)
The performance also reveals darker undertones: footage from inside a satellite during its lift-off, followed by aerial shots of the earth from high above, eventually gives way to a city being targeted by a drone. Narrating all of this is a woman’s voice from Terry Riley's Sun Rings project, originally commissioned by NASA, repeating the line, “One earth, one people, one love.”
If Paglen set out to show how machines interpret art, and whether that interpretation can be considered “accurate,” then he has succeeded. Sight Machine is a reminder that we must continue to challenge and monitor the role of technology in society, and that, though robotics and AI will have growing roles in our world, machines crucially lack subjective human emotion and the ability to feel.
This was most evident when the Kronos Quartet asked the audience if they could play one more song. The machines were shut off, and the foursome performed Vladimir Martynov’s emotive “The Beatitudes” with eyes closed and nothing short of pure fervor. In that moment, it was crystal clear that emotion, in both playing and listening to music, is a distinct advantage that humans hold over machines. It served as a heady prompt that many in the crowd would do well to remember.
On the heels of Paglen’s newly announced residency at Stanford’s Cantor Arts Center, where he will continue to research artificial intelligence and machine vision tech, The FADER talked to him about Sight Machine and the implications of full technological immersion in our daily lives.
How long have you been working on Sight Machine?
It’s part of a larger body of work about machine vision that I’ve been working on for three years. We had already been working on a video installation of a quartet playing, so when the opportunity to do this performance came up, we decided we wanted to adapt the video into something live. The software that we’re using for the live performance is a modified version of the software we wrote to do the overall machine vision project.
Why did you want to do the performance at Pier 70?
The original invitation was to do a project at Pier 70. For me, one of the subtexts is that the Bay Area has a very long tradition of people throwing concerts and punk rock shows in warehouses. Unfortunately, that tradition came into the public eye with the Ghost Ship fire in Oakland a few weeks ago. It’s a tradition that, in many ways, is being replaced by the technology industry. That spoke to me, because I got my start as an artist doing work in warehouses in the Bay Area. So doing this warehouse-style show resonated with me.
How would you describe what the viewer sees and experiences in the performance?
The musicians from the Kronos Quartet are performing, and there are different cameras set up to film them. Those cameras are attached to a suite of software that we developed, which allows us to take different machine vision algorithms used in everything from manufacturing to face detection for iPhones, to surveillance, to guided missiles. The cameras are attached to the software application, which allows us to use the different algorithms. Then the software makes an image that shows us what the algorithm thinks is interesting in a particular image or what it’s interpreting. That’s what you’re seeing projected behind the performers, as they’re being seen through this array of different algorithms through the course of the performance.
“Music is very affective. It speaks to your body as much as it speaks to your mind. It’s very human in that way.”
How would you describe the musical element, in terms of what you wanted the computer to interpret?
The Kronos Quartet is playing music from around the world. Some of the music has relationships to technology — 17th-century or 18th-century technology, for example — and some to politics. One of the pieces they’re playing is George Crumb’s “God-Music,” which is a piece from the Vietnam War era. It’s a very powerful piece of protest music. Another example is [Steve Reich’s “America—Before the War” movement from Different Trains], which is all about a [Jewish American child's experience of excitement and romance for rail] travel during the late 1930s. The piece is about the exuberance around this form of technology, but then it’s also haunted by the Holocaust. In many cases, the music tells these stories in ways that are perhaps relevant to this new world of machine readability and technology that we find ourselves living in.
What did you learn about what computers find interesting about music?
It’s very quantified. This is one of the reasons why it’s a music performance, instead of something else. Music is very affective, in a way. It speaks to your body as much as it speaks to your mind. It’s very human in that way. This is something that a machinic interpretation of music cannot capture at all. It really underlines that contradiction between what is human and the quantified understanding of the world that machines produce.
How did you decide to work with both Obscura Digital and Kronos Quartet as partners in Sight Machine, and what made them good collaborators?
They were fantastic. I had the idea for the performance and thought, “Well, we need a string quartet.” Someone at Stanford asked, “Which quartet would you want to work with?” I told them that the Kronos Quartet would be the obvious, best choice, and that it would be a dream to work with them, but I thought they would definitely be too busy. Someone from Stanford called them up, and they said they’d love to participate. They’re dream collaborators for this project.
Pier 70 is actually going to be Obscura Digital’s new headquarters. They have a connection to the building. They have a very similar background to mine as an artist. Their founders had bands and punk rock shows in warehouses, and the company evolved from that, as well. It’s nice to work with old school, San Francisco artsy people on it.
How do you see Sight Machine developing during the residency and turning into a new body of work?
Sight Machine refers to this specific performance, which I hope we’ll be able to do in different places. With the residency at Stanford, it’s really about getting deeper into these questions and technologies. What’s great about being able to work there is getting to work with the cutting-edge laboratories and the people who are at the forefront of thinking about what the social and civil liberties implications of this stuff are.
Is there any specific takeaway or exchange you hope happens with the viewers of San Francisco and Silicon Valley?
There’s not a specific takeaway or a message, per se, as much as I hope to trouble some of the exuberance around these technologies, things like artificial intelligence and machine learning — to get people to think about what some of the undesirable implications of mass surveillance might be, and what might be lost as these technologies become more meshed into our daily lives.