YACHT on using AI to make new record Chain Tripping
LA-based trio YACHT talk about using machine learning technology to produce their latest album Chain Tripping, and why we still need humans in the loop
"We've always liked making work that has a deep end and a shallow end, and that's kind of our entire band," one third of YACHT, Claire L. Evans, tells us over the phone from her home in LA, which she shares with her partner and bandmate Jona Bechtolt.
"You can dip a toe in on the shallow end and it's just party music, then you swim all the way to the deep end and it's just so weird... there's 17 years of just the weirdest stuff for you to chew through, and that's our great joy." This statement couldn’t be more true of YACHT, a band who, over the years, have made a name for themselves by bringing together the worlds of music and technology, at times to polarising effect.
Using data to kickstart YACHT's new album
Over the course of their 17-year career, the trio, completed by longtime collaborator Rob Kieswetter, have delved into the worlds of perfumes, apparel, apps and sculptures, but for their seventh album, they delve deeper into the technological world than ever before. In order to create Chain Tripping, the band filtered their entire back catalogue of 82 songs through machine learning software – an application of artificial intelligence – and used the MIDI data it fed back to them to produce the album’s tracks.
"We had this massive volume of data that we were working with and then we were… picking through the volume and trying to find interesting moments," says Evans. "It's not something where you put information in and get information out, and then use it as is. We're not at a point in the technology where that is feasible or aesthetically interesting at all; there really have to be the humans in the loop."
Bechtolt adds: "So it's just like getting sheet music; we got all this sheet music back and then we had to decide what music went with which instrument, so if there was a bass line or a guitar line or a vocal melody. And then we performed those and recorded all of those performances – and that's how we got through the composition part of it."
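The band haven't published the exact tools behind this step, but the shape of the workflow – train a model on a MIDI corpus, generate a mass of candidate material, then have humans pick through it for interesting moments – can be illustrated with a toy stand-in. The sketch below uses a simple first-order Markov chain over MIDI note numbers (a deliberate simplification; the "catalogue" melodies and the keep/discard filter are hypothetical, not YACHT's actual data or criteria).

```python
import random

def train_markov(sequences):
    """Build a first-order transition table from MIDI note sequences (0-127)."""
    table = {}
    for seq in sequences:
        for a, b in zip(seq, seq[1:]):
            table.setdefault(a, []).append(b)
    return table

def generate(table, start, length, rng):
    """Sample one candidate melody from the transition table."""
    seq = [start]
    for _ in range(length - 1):
        choices = table.get(seq[-1])
        if not choices:
            break
        seq.append(rng.choice(choices))
    return seq

# Hypothetical "back catalogue": two short melodies as MIDI note numbers.
catalogue = [[60, 62, 64, 62, 60], [60, 64, 67, 64, 60]]
table = train_markov(catalogue)
rng = random.Random(0)
candidates = [generate(table, 60, 8, rng) for _ in range(5)]

# The human-in-the-loop step: discard fragments without enough melodic movement.
keepers = [c for c in candidates if len(set(c)) >= 3]
```

The point of the sketch is the last line: the model only proposes material, and a person still decides which fragments become a bass line, a guitar line or a vocal melody.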
Collaborations and creativity
Next they had the lyrical process to contend with. This involved a collaboration with creative technologist, hacker, data scientist and poet Ross Goodwin, whose credentials include working at Google and ghostwriting for the Obama administration. Goodwin built the band an algorithmic model trained on their back catalogue, as well as their musical influences, in order to "replicate the stew of influence and experience that we would normally dig deeper into," says Evans.
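Goodwin's actual model isn't described in detail here, but text generators of this family learn which characters or words tend to follow a given context in a training corpus, then extend a seed from those statistics. Below is a minimal character-level sketch of that idea – a toy Markov-style generator, not Goodwin's system – using a hypothetical mini-corpus standing in for YACHT's lyrics and influences.

```python
import random

def train_char_model(corpus, order=3):
    """Map each `order`-character context to the characters that follow it."""
    model = {}
    for i in range(len(corpus) - order):
        ctx = corpus[i:i + order]
        model.setdefault(ctx, []).append(corpus[i + order])
    return model

def generate_lyric(model, seed, length, order, rng):
    """Extend a seed context one character at a time."""
    out = seed
    for _ in range(length):
        choices = model.get(out[-order:])
        if not choices:
            break
        out += rng.choice(choices)
    return out

# Hypothetical mini-corpus standing in for back catalogue plus influences.
corpus = "dream machine dream machine shallow end deep end "
model = train_char_model(corpus, order=3)
rng = random.Random(1)
draft = generate_lyric(model, "dre", 40, 3, rng)
```

As with the composition step, the output is raw material: drafts like this still need a human to select, trim and arrange lines into something singable.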
And the collaborations didn’t stop there either, with the band bringing on numerous creatives specialising in AI to assist on other elements of the album. Mario Klingemann, artist in residence at Google Arts & Culture, was brought in to produce the band’s photographs, New Zealand-based AI artist Tom White made the album’s artwork, and they worked on various other elements, such as typography and music videos, with the likes of Allison Parrish, Barney McCann and Counterpoint’s Samuel Diggins and Tero Parviainen.
"Basically every component of an album that can have an interesting AI or machine learning component, we tried to identify our favourite creative practitioner in that field," says Evans. "I think that's kind of a really nice thing about art and technology is that it sort of requires collaboration between people… And that, for us, is also a big part of the process, and the project, and a big part of the joy of working in this way is not as much what it produces but the kind of connections and conversations that we can have."
Through the band’s extensive research and multiple collaborations, they were also made aware of the NSynth – an ongoing experiment by Google’s Magenta team, whose primary research revolves around exploring how machine learning can be used as a tool in the creative process. "When we first started playing around with it, we thought it was kind of a joke," says Evans. "And then we kind of fell in love with it because we realised that it was this sort of high-tech, lo-fi object and that is exactly who we are, and it's exactly what we're doing."
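What makes NSynth distinctive is that it doesn't crossfade two recordings; it represents sounds as vectors in a learned latent space and decodes blends of those vectors into new timbres. The sketch below shows only the blending arithmetic with made-up four-dimensional embeddings (the real NSynth embeddings are far larger, and the decoding-to-audio step is omitted entirely).

```python
def interpolate(z_a, z_b, t):
    """Linearly blend two latent vectors; t=0 gives z_a, t=1 gives z_b."""
    return [(1 - t) * a + t * b for a, b in zip(z_a, z_b)]

# Hypothetical embeddings for two instrument sounds.
z_flute = [0.1, 0.9, 0.3, 0.0]
z_bass  = [0.8, 0.2, 0.5, 1.0]

# A point halfway between the two timbres in latent space.
halfway = interpolate(z_flute, z_bass, 0.5)
```

In the real system, a neural decoder turns a blended vector like `halfway` into audio that is neither flute nor bass but something in between – the "high-tech, lo-fi" quality the band describe.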
Bechtolt adds: "It was sold to us as this insanely complex process… under the hood it's really impressive, but – the output at first – we were like ‘shouldn't this sound more futuristic if so much money and time is going into it?’ But then we realised that we could bend it to our will and make it part of our arsenal."
"People and automated systems can work together..."
Of the many musical projects that have explored AI technology recently, though, Chain Tripping is perhaps the least likely to be described as experimental. There’s still the same distinct YACHT sound across the record, and these are still very much pop songs despite all the complex technological elements that went into them. "We wanted to make songs that were undeniably YACHT songs; that sounded like us, but maybe a little bit off or a little bit weird," says Bechtolt.
Evans adds: "We're at this point in machine learning where technology is really mind-blowingly sophisticated and requires a huge amount of computing power but, at the same time, you can't just press a button and make a song; it's not possible yet… We're in an interesting moment right now where people and automated systems can work together to create things that are greater than the sum of their parts."
The world of AI and machine learning technology is undeniably overwhelming, and no matter how much information you absorb, there is always so much more to learn. But with albums like Chain Tripping using AI to produce art that is accessible and palatable to an average audience, you can dive in at the deep end and still manage to swim.