How an AI-assisted album is asking big questions about music’s uncertain future

patten, a London-based producer and artist, used a text-to-sound program to create the samples on his latest full-length album Mirage FM.

May 12, 2023
patten. Image provided by the artist.  

patten, the London-based producer and artist born Damien Roach, has used generative AI in his practice for years, but in late 2022 he came across something new. Riffusion is a website built on Stable Diffusion, a neural network model trained on vast troves of images scraped from the internet, which traditionally turns text prompts into pictures. Riffusion cleverly hijacks this by using text prompts to generate a spectrogram, a visual representation of sound, which is then converted into audio. Users type in whatever they like (“Puff Daddy polka,” perhaps? Or maybe “Interstellar soundtrack baile funk 2023?”) and save the resulting clip to their computer. After stumbling across the tool, patten spent a sleepless 36 hours generating a vast library of sounds with Riffusion, the raw material that he would spend the next several weeks poring over and assembling into his new album Mirage FM.
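
For the technically curious, here is a minimal sketch of how a Riffusion-style text-to-spectrogram-to-audio pipeline can be wired up: a Stable Diffusion checkpoint renders a prompt as a spectrogram image, and the image is then inverted back into sound. This is an illustration of the idea, not patten's actual process; the checkpoint name, the pixel-to-decibel mapping, and the audio settings below are assumptions.

```python
# Hypothetical sketch of a Riffusion-style pipeline, not patten's workflow:
# a Stable Diffusion checkpoint renders a text prompt as a spectrogram image,
# and Griffin-Lim phase estimation turns that image back into audio.
# The checkpoint name, pixel-to-dB mapping, and STFT settings are assumptions.
import numpy as np
import librosa
import soundfile as sf
from diffusers import StableDiffusionPipeline

# Load the publicly shared Riffusion fine-tune of Stable Diffusion (assumed name).
pipe = StableDiffusionPipeline.from_pretrained("riffusion/riffusion-model-v1")

# Ask for a spectrogram image the same way Riffusion's website does, via a prompt.
image = pipe("Puff Daddy polka", num_inference_steps=30).images[0]

# Treat pixel brightness as spectrogram magnitude. The real app uses a
# mel-scaled image and its own reconstruction code; this linear mapping is
# only meant to show the shape of the idea.
gray = np.asarray(image.convert("L"), dtype=np.float32) / 255.0
magnitude_db = np.flipud(gray) * 80.0 - 80.0  # brightness -> rough dB range
magnitude = librosa.db_to_amplitude(magnitude_db)

# Estimate phase with Griffin-Lim and write out a short audio clip.
audio = librosa.griffinlim(magnitude, n_iter=32, hop_length=512)
sf.write("generated_clip.wav", audio, 44100)
```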

Riffusion’s outputs helped patten to expand his existing work with AI. His creative agency 555-5555 has created the visuals for the most recent Caribou tour and worked with a roster of musicians that includes Jayda G and Nathan Fake. Over the phone from London, patten tells me how big questions, rather than novelty, drew him to use generative AI for visual work. “[I want] to try and use these tools as a way to interrogate the way that we engage with images [and] how we actually engage with each other and the world and society… It's easy to forget how powerful images are in… forging out paths for what the future might look like.” For patten, Riffusion represented a new dimension of AI’s approach to creating sound.

Mirage FM is, crucially, not AI-generated — this is not made with the same all-in-one technology as Boomy, an app that allows the user to generate full songs in seconds. What patten has done with Mirage FM is far more traditional: a play on the experimental collage genre of musique concrète, using music that was created by a neural network. Described in a press statement as “crate digging in latent space,” the album is a fascinating collection of uncanny approximations with a washed-out, early-YouTube-level sound quality. Modeled on a radio station playlist, the 21 tracks flit and flicker between styles like a more plunderphonics-minded J Dilla — “Walk With You” offers mutant UK garage, while tracks like “Alright” and “Where Does The Time Go?” are examples of the project’s interest in creating songs from generated permutations of R&B.

I spoke with patten on April 26 in the shadow of a viral song created using generated vocals in the style of Drake and The Weeknd. It felt like a cataclysmic moment for the music industry; the developments that have emerged since make that view seem almost parochial. On May 1, Geoffrey Hinton, the Turing Award-winning computer scientist known as the “godfather of AI,” announced that he had quit his position at Google in order to warn the public more freely about unregulated advances in generative technologies that he fears could pose an existential threat to humanity. The following day, the Writers Guild of America (WGA) went on strike. One of its main contentions is that the Alliance of Motion Picture and Television Producers (AMPTP) refuses to engage with the union over rules intended to protect WGA writers from the prospect of being replaced by generative programs like ChatGPT.

patten is not blind to what’s happening. Concerns about AI’s advancement “obviously exist and they're very real,” he says. They are also woven into Mirage FM’s sound. “I was trying to make some music that did have all of those questions about authorship, about different ways of listening, about technology, about the present and the future,” he says. His approach to AI puts him in the camp of groundbreaking artists like Actress and Holly Herndon, who have incorporated advanced machine learning into their recorded output and beyond. “I wanted to present this optimistic sense of potential [of] what these tools could offer up in terms of possibilities,” patten says. He spoke with passion throughout our conversation, which covered Mirage FM’s creation, his intentions behind it, and his broader concerns with the new atomic age ushered in by AI. But he was also clear-eyed about the implications of a first-of-its-kind album like Mirage FM and how a hesitant public would receive it. “What you take away from it is very much dependent on your position,” he says, “whether this is a joyful moment or a moment to take pause and ask questions about what's right and what's ethical.”

The FADER: Listening to Mirage FM helped me to turn a corner a little bit when it came to art that uses AI like Stable Diffusion. The images that I've seen made with Stable Diffusion have more often than not prompted revulsion, especially when they are intentionally created in the style of another artist. Still, part of me wonders how my response to the album would have been affected had I gone in blind.

It's an interesting question, isn't it? How that knowledge affects the way that you listen to it. I think if anything, people give it a bit of a harder time. This is also one of the reasons why I thought it was important for me to even make this record: I think listening to it, it's very clear that it's a patten record. That's quite important because in itself, just as an artifact, it makes lots of statements. One of them is the fact that, yes, I've used these tools, and I've generated all this material and these things that I've used to sample and produce new music from. But it still very much sounds like something that I would make under any other circumstances, albeit with specific characteristics that had come from the process.

Was maintaining that sense of identity ever a challenge when you were making this record?

No, not really. I was just so engaged in doing it that it never felt like a chore or something. The process and the origin of the sound was something embedded in the music itself. But I thought an important part was this idea that, listening to it, it made it clear that anybody accessing these tools and deciding to use them could very much retain their identity, and their own ideas, and impulses, and aesthetics, whilst also engaging with artificial intelligence as part of the process.

I've played around with Riffusion myself, and I was really struck by the narrow range of results it was spitting back at me. That may have had to do with my prompts, though. For example, I typed in “death metal” and I got something that sounded like '90s alternative rock.

I'm really interested in language as a form, as material, and that's something that I've explored, certainly in my artwork under my own name. I've been using these text-to-image tools for some time, and learning about how they work, and trying to speak, trying to find ways to translate what it is that I am trying to see, or trying to explore, into a language that can be understood by these systems. When I bumped into Riffusion, I suppose I had a bit of a head start in terms of how to actually get interesting things out of such a system.

I think in a similar way to a piano or a guitar or a typewriter or a pencil and paper, these systems kind of await activation by an active human mind. And they don't sit there doing anything without anyone asking them to. And I think that in terms of looking at what kinds of prompts generate what kinds of outputs, I think one of the things that's so joyful about the present is sort of seeing how these tools are so differently activated by different people.

I found, in the process of actually exploring what Riffusion could do (which resulted in the material that I used to create Mirage FM), that it was really a combination of sometimes very, very specific directions and sometimes really vague ones. In the text-to-image space, sometimes you see these kinds of very, very detailed prompts. Which are great, and I've used processes like that myself. But I think there's also a lot to be said for the weird zones that you can find and explore through quite vague descriptors as well.

Talk to me about how you approached the raw material.

With this particular project, it did open up this possibility where it seemed to make sense to think a bit like a crate digger, or crate digging in latent space. What I was trying to do in the way that I made these arrangements was, one, to just make some interesting music. And also to speak to this idea of a potential for different ways of making music and different ways of listening to music as well.

So it was a lot about formulaic structures, things that were in some ways reminiscent of forms and styles that are very familiar, but then presenting themselves as things that felt like they were coming into being or falling apart. A bit like that point when a Polaroid photo is just starting to develop. And if you can imagine finding a way to just distend that moment of apparition, of something coming into being, and just staying there.

Another example I'll give is, and I'm just thinking of this now, but if you turn on the radio or a stream or something like that, and right in the middle of a piece of music, for a second, it just sounds like chaos. You can't tell what it is that you are listening to. And then it will just slowly, or maybe sometimes quite quickly, just pop in and pop into something that makes sense.

I interviewed Actress back in 2017 and he said the exact same thing about musical moments that inspire him.

Well, there you go. It's great you bring up Actress, because Darren's work is super-interesting on this front. And also he has worked with AI in the past with his Young Paint alias.

I don't think that most people, myself included, really appreciated the significance of that release.

I know. Yes, it's one of those weird things, the comprehension that audiences have about certain things. At least in the way that they're framed. Maybe it would've been different if that came out as an Actress record. Who knows?

This is also one of the reasons why I wanted to be really transparent about what was going on. I think in all of the information that's gone out there, it's been really upfront about exactly the tool that I've used. That's why also I came up with that term “crate digging in latent space.” Just so that people could have a way in to understanding what was going on there.

Because I think it's hard if you're not super-techy or even interested in it, because people who are into music aren't necessarily interested in how music is made. You can love the Beatles without caring about chord structures. You don't have to. Because the music just touches you.

The idea of latent space, and this is probably entirely intentional on your part, also extends to the emotional space of the songs. I'll listen to a song and I'll get glimpses of stuff that might feel a little bit nostalgic, but it's swirled and distorted and shrouded in this low-resolution sampling quality that makes it feel very off. It’s like you’re skydiving through an emotional landscape. You don’t really have anything to grab onto.

Yeah, I love that analogy. If you think about data points creating clouds of information, it's almost like falling through clouds of history, of feelings, of memories, personal memories but also larger cultural memories. It's not purely nostalgic, though; I think the uncanniness of it is both very alien and futuristic, but then also really familiar and homely at the same time.

It's really connected to an experience of being online that I think is contemporary. Everyone's internet drops out.

I think this is also something that was solidified by everybody's experience over the Covid lockdowns or whatever, but the idea of a division between IRL and URL has been completely shattered. And I think it almost seems quaint to think of those as two different things now. This experience of multiple streams of data, of flicking between one thing and another, even the idea of the potent 15 seconds of sound that you get exposed to on TikTok, or walking past a store that's blaring music out, I think all of those things are sort of one space now. And I'd say that the invocation of being online goes further than that. It's actually just an invocation of being in the present, here in 2023, which is characterized by this weird mutated experience of data, of information. And the way that information might resonate emotionally.

There are some well-founded fears about where generative tools could lead. For example, DSPs like Spotify could be deliberately saturated with trash just to bolster the market share of major labels and to stifle royalty payments. And within my industry, when an outlet announces layoffs, occasionally it comes with a statement from the owner hinting that generative tools could be brought in to replace the fired workers. Do these concerns ever give you pause?

I remember the first time using DALL-E, even after working with AI-assisted image generation systems for some years before that. The speed and ease with which it was possible to generate really compelling things using that tool was really scary. I've invested so much of my life in making things. And to see that changed so radically, so quickly, was quite strange. It's still something that I'm trying to figure out how I feel about.

When I was making this record, there was a moment where I was like, “Well, is this bad? Should I actually be trying to find the Riffusion server and destroying it instead of making something with it?” I think that woven into the record is all of these questions. I don't know what's going to happen. And it is a concern. But I do think that it doesn't seem like it's going back in the box, and the conversation [should not be] dominated by platforms who do not care about ethics, who do not care about individuals, about justice, whatever it might be.

I'm excited and really interested in how taking away all these barriers to entry opens up different ways of communicating, in images and sound or whatever, for as many people as possible. I think that's great. But also, as somebody who has spent so much of my life dedicated to these things, it's an odd moment for sure.

It's an important time for artists and musicians and writers of different backgrounds, and all sorts of different ways of thinking about things to start to figure out in practical terms what it is that we want these tools to be for us and where they fit into things. And how new frameworks can be established to make a world in which this stuff can work for the greater good.
