
BYU PhD student creates computer that composes music

When BYU PhD candidate Kristine Monteith sat in her natural language processing class, it wasn't letter sequences running through her head but musical notes. The class was discussing how language models use probability to help speech recognition systems predict which words come next. When she tried the experiment herself, she had a realization. "It made me think, oh, you could do the same thing with music," Monteith said.

The Utah State graduate in music therapy, now pursuing a PhD in computer science at BYU, decided to apply her right- and left-brain abilities to combine music and computer science. She invented a computer program that composes original music evoking emotions listeners can relate to, even though it is generated by a machine. She explained that the program starts from a few notes and, based on statistics, plugs in the notes most likely to fit.

"Given these three notes, what note is most likely to follow it," Monteith said. "It's random generation. It's starting a new melody every time, but it's based on statistics."

After working on the project for only a short time, Monteith had the chance to publish a paper about it for a conference, which meant completing a study within six weeks of the deadline. In the survey she ran for the paper, she found that 54 percent of listeners could identify the intended emotions in the computer-generated music, while only 43 percent could identify the emotions in the human-composed melodies. Monteith said she was just happy the computer-generated music could be placed on the same level as music composed by humans.

"One of the cool things was the fact that I put stuff together, and it worked so I could get published," Monteith said.

She used popular soundtracks from movies like Jurassic Park and Toy Story to teach the program to recognize emotions in music.

Her doctoral advisor, Tony Martinez, said he has enjoyed working with Monteith and seeing what the program can do with the research and work the two have put into the project.

"The most difficult part is getting the computer to do a real good job composing music which is as good as a human can compose," Martinez said.

He also said he has benefited from working with a student who combines artistic creativity with the minute details of computer science.

"She's a hard worker," Martinez said. "She's passionate about music and computing and the way those things work together, and I think that's a big part of it."

After graduating from Utah State, Monteith worked at several places where she used music therapy to influence the emotions and behaviors of women with eating disorders, retirees and the severely depressed.

Now the mother of a 6-month-old daughter, Anna, she said she is excited to use her knowledge of and experience with music composition to teach her daughter lessons like picking up her toys and having good manners through song.

With her program, she said she looks forward to generating those songs and helping Anna feel different emotions.

"I think the songs that express love are the ones I liked the best," Monteith said. "The next step is to see how to perfect them and get songs I really like."

Associate computer science professor Dan Ventura helped Monteith along the way and said he likes what she has done. Drawing on his experience with artificial intelligence, he said he is excited about the advances in computational creativity her project represents.

Although he does not know exactly how she will improve the project, he said he assumes she will try to put more structure into her music.

"Right now she is producing music that doesn't have any overarching themes, no global structure," Ventura said. "It generates note-to-note with high probability put together that bring a pleasing musical sound. That's one way to improve things is to put that structure there."

When asked if the music is as effective as Bach or Beethoven at generating emotion, he answered, "You can't do that. Yet."