Stepping into the San Francisco Symphony Soundbox on April 6, we were greeted by a number of different robots showing off different skills, from painting on canvases to a robot dog doing tricks. The program itself consisted of a combination of musical scores accompanied by AI components. Hosting the program was Carol Reiley, a roboticist in the Bay Area who explores how AI can be used in the production of music.
Artificial intelligence was first introduced as a field of study in the 1950s, when computer scientist John McCarthy defined intelligence as the “computational part of the ability to achieve goals in the world.” The field was founded with the intent to create machines that could use language and form concepts to think and solve problems for themselves, just as humans do.
Today AI is being used in more ways than ever: fully autonomous vehicles drive the streets of major cities, image generators produce pictures of anything you can imagine, and programs even compose music. AI has come a long way.
“My goal is never to replace what humans can do better,” said Reiley. “My work was not necessarily interested in exploring music that robots were conducting or robots playing instruments. I was really interested in different ways that technology could showcase a piece and have the audience experience it in a new light or new perspective.”
AI music isn’t just for the symphony; you might also hear it inside a doggy daycare. That’s right: some people, like Caleb Phillips, are experimenting with AI to help make songs that calm anxious dogs. The music plays through a collar that monitors the dog’s heart rate to determine which music the dog likes and dislikes.
“It takes down all the music, and what notes and pitches that the dog likes, and records it,” said Phillips. “Then it goes through Spotify and says, ‘Okay, this is what that note is,’ then it records that and then it puts it together and makes music.”
Using AI in the production of music has sparked some ethical concerns.
“I think music or art, whatever is created, is based on an individual’s experience. I don’t know if you can necessarily just input that into a machine to get that human experience of things to reflect on and influence the work that you do,” said Norio Fujikawa, who was waiting in line for the Soundbox program.
The future of AI in the production of music is still up in the air, but its use in the music industry is only expected to grow.
“I hope that we as people can control the destiny of AI,” said Fujikawa.
Podcast Transcript
GM: I love music, it’s been a part of my life for as long as I can remember.
SW: Me too, growing up listening to music was my favorite thing to do with my dad. We listened to all kinds of different genres and would bond over how the music made us feel.
SW: I’m Sydney Williams,
GM: and I’m Giovanna Montoya, and this is...
BOTH: An Artificial Future!
GM: I feel like music is a universal language. I started playing piano when I was 6 and I love how music sort of caters to my emotions. I haven’t had the opportunity to play much in the last couple of years, but being physically involved with the music is one of my favorite aspects, and that is changing.
SW: That’s right. Artificial intelligence is becoming more prominent in the music industry.
GM: I have seen so many interesting ways in which AI has been a tool in the music industry, like how AI technology was used to enhance the original audio of a Beatles track,
SW: or how Grimes is allowing people to produce music using an AI-generated version of her voice.
GM: Using AI in the production of music may have its benefits, but there have been some ethical concerns. Some have raised the concern that using AI-generated tools produces a less creative and less genuine track compared to music produced solely by humans.
SW: There are other worries that AI-generated music will take away jobs from musicians and composers.
GM: We wanted to learn more about AI and music. So where did we go?
BOTH: The San Francisco Symphony!
SW: We are at the San Francisco Symphony on the walk over to find the entrance for Carol Reiley and the robots. But we’re not completely sure where we’re going, so we’re just going to circle the symphony. Oh, I found it!
GM: Oh my gosh.
SW: To get the inside scoop, we decided to talk to some of the people lined up outside waiting to get in.
GM: Larry Horvath, Cynthia Douglas and Norio Fujikawa shared their understandings of the event, as well as their thoughts and opinions about AI in art and music.
Larry Horvath (2:55): So she’s both a musician and totally into AI. The people in the symphony love this because they get to do, like, avant-garde music. So she wants to make this really interactive. There’s a bar, you sit down, the musicians get in there with the audience. But it’s fun.
Cynthia Douglas (2:13): Music for me is a very personal thing. And AI is a tool. It’s like an instrument.
Norio Fujikawa (4:02): I think again, it’s a tool, and almost maybe becomes another art form of itself. Perhaps. I don’t think it’ll take the place of what people create, but it may open up new ways of creating different kinds of art forms or content. And that’s what artists always do: always looking at new ways to create, whether it’s an expression, a point of view, a new medium. So I think there’s potential for it to be something else.
GM: Originally, Sydney and I were not planning on actually attending the event,
SW: College students don’t necessarily have the funds for Symphony tickets.
GM: However, we couldn’t let that stop us.
SW: Outside of the symphony, we ran into Scott Pingel, Principal Bass of the San Francisco Symphony who was performing at the event.
Scott Pingel: Can you get in? Do they have tickets for you?
GM: No, they don’t.
SP: Don’t go anywhere. I have two tickets. These are student journalists. So they want to come see the show.
Unknown: You guys are- you’ll love it, come see it.
GM: Once inside the Soundbox theater, we were greeted by a number of different robots performing different skills, from painting on canvases to robot dogs doing tricks.
Audio from the opening of the program at the symphony: Welcome to Soundbox, please take a moment to locate your nearest emergency exits, which may be here, here, here or here.
GM: The program itself consisted of a combination of musical scores accompanied by AI components and highlighted a musician performing Beethoven’s “Für Elise” with an AI-generated violin line.
Carol Reiley: The name classical music makes it feel stuffy or like so static. It’s a fresh perspective, especially with AI which is so futuristic.
SW: Carol Reiley is a roboticist working with artificial intelligence and music in the Bay Area. She has been given the nickname “Mother of Robots,” and has been named one of the “World’s 50 most renowned women in robotics”.
CR: I have looked to explore different ways that technology can enhance creativity in the arts. And my goal is never to replace what humans can do better. My work was not necessarily interested in exploring music that robots were conducting or robots playing instruments. But I was really interested in different ways that technology could showcase a piece and have the audience experience it in a new light or new perspective.
Carol Reiley at the symphony: So the piece that we’re about to do next is “Duet for Heart and Breath.” The musicians are conducted by my heartbeat, and you’re going to hear them utilize their breath as instruments. There’s also a brave student volunteer here who’s connected to an Emotiv EEG headset, and her brain signals are going to be shown on the projectors in real time as she listens to the music during this live performance.
GM: In 2021, Reiley and a colleague co-composed a short AI-assisted piece with the help of a program created by a friend at OpenAI. The score was performed by the San Francisco Symphony, and the album was Grammy-nominated.
CR: I think it was also nice to push the musicians in ways that they’re not used to because, I mean, they’re amazing. I’m not, like, a composer. I didn’t have, like, a specific vision, and I didn’t have a certain way I wanted it played. So I had them, like, improv. And I was like, “Are you comfortable?” And so I was taking them, like, out of their comfort zone, asking, pushing them to go out of their own comfort zone.
GM: I feel like in many ways, this recent surge of AI is sort of pushing us all out of our comfort zone.
SW: I mean, there’s so many movies, books and television shows about how robots are going to rise up against humans.
GM: The Terminator was set to travel back in time from 2029. That’s literally five years from now.
SW: Okay, I think we’re getting a little ahead of ourselves. AI is far from taking over the world.
GM: For now, AI is being used to explore creativity. Well, and help dogs with anxiety.
Caleb Phillips: I consult for a company called Barklei, and basically what they are is like a like Fitbit for dogs.
SW: That’s Caleb Phillips. He works with AI and music. We met him during the show at the symphony.
GM: If you feel like this story has taken a turn, that’s because it sort of has. AI-generated music therapy for dogs is yet another one of the many ways in which the tech and music industries are being integrated.
CP: A lot of the technology behind that is AI-generated, because, like, it plays music to help calm down the dog’s anxiety through the collar. But it doesn’t play just any music. It’s, like, AI-generated music. It takes down all the music, and what, like, what notes and pitches and stuff that the dog likes, it records it. And then it goes through Spotify and says, “Okay, this is what that note is.” And then it records that, and then it puts it together and makes music.
GM: Caleb has been working with AI and technology for a while and happens to be a huge music lover himself.
SW: He also wonders what the future will look like with the use of AI and music.
CP: Music and art, the merge. It hasn’t really happened. But like the idea of music, where music is going with, you know, tech music, and now you can make music on your computer and all that stuff. And then the power of AI. So I think it’s they’re merging like now. So I don’t think it’s changed yet. But I do think it will.
GM: Although many are concerned about the progression of AI and what that means for the future of jobs and genuine creativity, he looks at things a little differently.
CP: I think it just depends on the perspective that you’re looking at it from. If you look at it as like, “Oh yeah, this is, like, a new way to add a new genre of music,” then yeah, I think it’s good. If you think of it like, “Oh, now this is going to replace music, and now artists aren’t going to be as special,” then no, I don’t think it’d be good. But I don’t think that’s the case, in my opinion.
SW: According to Carol Reiley, we are only going to see more and more of a partnership between the tech and music industries, but the future of AI and music is still up in the air.
CR: You know, AI is so disruptive and so fast. And when it’s placed in the hands of a few people who can revolutionize it. It’s a heavy responsibility. And I do think that I’m very curious to tell like positive stories about like how tech can enhance our creativity. But I also realized on the other side that it will be very disruptive for the future of work and the way that things are currently done. And we will go through like this big change.
GM: From what we’ve learned, maybe we don’t have to be so scared of the changes and progression we will see in the future use of AI and music.
SW: Yeah, but I can’t promise that we don’t have to worry about Arnold Schwarzenegger coming to get us.
GM: Maybe in the future.
BOTH: Hasta la vista, baby!