Article from DLRmagazine 171: An artificial intelligence transforms an orchestral piece using live audience reaction

AI adds feeling to the music

A musician and DLR researchers have developed an artificial intelligence that weaves the audience's reactions into an orchestral piece live.
The AI draws on the listeners' emotions and heart rates, plus information from questionnaires that they have filled in beforehand, and then rewrites the piece.

No, the piece was not a smash hit – that much we have to admit. However, the orchestral work 'The Unanswered Question' by Charles Ives undoubtedly sounded different in this unique musical experiment – all thanks to the audience and an artificial intelligence (AI) program that evaluated the listeners' reactions to the original music and used the results to rewrite the piece.

Art in science, science in art. Saarbrücken-based musician Martin Hennecke united art and science for his project: 'The (Un)Answered Question – a musical data science experiment'. He joined the DLR Institute for Software Technology on an interdisciplinary research grant from the Helmholtz Information & Data Science Academy (HIDA). For three months, he worked with the researchers on the software, developed data clusters and gathered data. The key question was whether an AI program could create a live orchestral remix. It turns out that it can, as a trial performance in Dortmund in May proved. A concert is set to follow at Saarland State Theatre in November – in front of, and with the help of, a large audience.

Poetry, music and software come together

The event in Dortmund is on a somewhat smaller scale. There is a workshop feel at the venue – the Academy for Theatre and Digitality – which is supporting the project alongside HIDA. Three DLR employees have opened their laptops and are making final adjustments to the software. There are 10 chairs for the audience, two cameras for facial recognition, nine music stands for the musicians and another for the conductor. The rehearsal begins with a prelude – the first part of the project. An actor recites the poem 'The Sphinx' by Ralph Waldo Emerson (1803–1882). On a screen, a large MRI image of the actor's beating heart is displayed; it was recorded at the Max Delbrück Centre for Molecular Medicine in Berlin, another project partner. The theme of 'bigheartedness' is a common thread running through the project.

The poem itself revolves around a story from Greek mythology, in which a sphinx lays siege to the city of Thebes and kills anyone unable to solve its riddle. This story provided the inspiration for the piece 'The Unanswered Question' by Charles Ives (1874–1954). The American composer wrote long notes for a string quartet. Several times, a trumpet plays a refrain that sounds like a question, and four woodwind instruments play completely different melodies. During his lifetime, Ives was known for his adventurous compositions. True to form, this piece features certain dissonances – there are moments when it sounds a bit 'odd'.

The test audience listens to the music for six minutes. Although the music stands are set up as they would be for a real performance, the piece is not being played live; a video of a symphony orchestra is being shown. The listeners in Dortmund are wearing wristbands that record their heart rate. When the flutes start up, some pulses rise. The listeners filled out an anonymous online questionnaire beforehand, which included statements such as 'I'm empathetic and kind-hearted', 'I don't have much compassion for others', 'I'm helpful and selfless', 'I tend to be indifferent to others'. For each statement there were five possible answers, ranging from 'Totally disagree' to 'Totally agree'. This allows the AI program to make a more effective assessment of the audience. The facial recognition software distinguishes between happiness, sadness, anger, neutrality, disgust, surprise and fear. These emotions will form the basis for the AI's edits. How do people feel about the piece? The answer is apparent on the screen a little later: lots of little dots bounce back and forth between neutral, angry, sad and happy, while the heart image in the middle keeps on beating.
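The article does not reveal how the three data streams – facial-recognition emotions, wristband heart rates and questionnaire answers – are merged before the AI takes over. The Python sketch below shows one plausible aggregation step per time window; all class names, fields and the summary logic are illustrative assumptions, not the project's actual code.

```python
# A minimal sketch (not the project's actual code) of how per-listener
# readings might be combined into one audience "snapshot" per time window.
# All names and weightings here are illustrative assumptions.
from dataclasses import dataclass
from statistics import mean
from collections import Counter

EMOTIONS = ["happy", "sad", "angry", "neutral", "disgust", "surprise", "fear"]

@dataclass
class ListenerReading:
    emotion: str          # dominant emotion from facial recognition
    heart_rate: float     # beats per minute from the wristband
    empathy_score: float  # 1-5 aggregate from the pre-concert questionnaire

def audience_snapshot(readings: list[ListenerReading]) -> dict:
    """Summarise one time window of audience data."""
    emotion_counts = Counter(r.emotion for r in readings)
    return {
        "dominant_emotion": emotion_counts.most_common(1)[0][0],
        "emotion_distribution": {
            e: emotion_counts.get(e, 0) / len(readings) for e in EMOTIONS
        },
        "mean_heart_rate": mean(r.heart_rate for r in readings),
        "mean_empathy": mean(r.empathy_score for r in readings),
    }

if __name__ == "__main__":
    window = [
        ListenerReading("neutral", 68, 3.8),
        ListenerReading("happy", 74, 4.2),
        ListenerReading("sad", 81, 2.6),
    ]
    print(audience_snapshot(window))
```

A snapshot like this could be recomputed every few seconds, which would also explain the bouncing dots on the screen: each dot is one listener's current reading within the window.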

"Even if the audience is not familiar with data science, we can use the music and the visualisations to convey what is happening with the data and through the data."

Carina Haupt, staff member at the DLR Institute for Software Technology

Then the AI program gets to work. It gauges the audience's emotions and heart rate, assesses the information from the questionnaires and rewrites the piece. The form that this will take is impossible to predict. "I'm curious to hear how it will sound," says Martin Hennecke. The AI program does not work in a linear way. "If someone is feeling sad, the piece does not automatically become more cheerful." The algorithm, which the DLR researchers developed together with Hennecke, clusters the data to summarise them, evaluates them and then triggers changes to the remix.
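The clustering step described here is only sketched in the article, and the actual algorithm developed by DLR and Hennecke is not public. As a hedged illustration, the following Python sketch uses a standard k-means clustering (via scikit-learn) over per-listener feature vectors and maps the dominant cluster to coarse remix controls; every threshold, parameter name and musical mapping is an assumption made for the example.

```python
# A minimal sketch, assuming a k-means clustering step in the spirit of the
# article's description; the real DLR/Hennecke algorithm and its mapping to
# musical parameters are not public, so everything below is illustrative.
import numpy as np
from sklearn.cluster import KMeans

def remix_parameters(feature_matrix: np.ndarray) -> dict:
    """Cluster per-listener feature vectors (arousal, heart rate,
    questionnaire score) and derive coarse remix controls from the result."""
    kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(feature_matrix)
    # Take the centroid of the largest cluster as the "mood" of the room.
    labels, counts = np.unique(kmeans.labels_, return_counts=True)
    arousal, heart_rate, empathy = kmeans.cluster_centers_[labels[np.argmax(counts)]]
    return {
        # Faster average pulse -> shorter note values (purely illustrative).
        "note_length_factor": 1.0 if heart_rate < 75 else 0.5,
        # Higher "arousal" score -> denser harmonies.
        "harmonic_density": "rich" if arousal > 0.5 else "sparse",
        # Deliberately non-linear: sadness does not simply make it cheerful.
        "mode": "open" if empathy > 3.0 else "ambiguous",
    }

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Columns: arousal in [0, 1], heart rate in bpm, empathy score 1-5.
    fake_audience = np.column_stack([
        rng.uniform(0, 1, 10), rng.uniform(60, 95, 10), rng.uniform(1, 5, 10)
    ])
    print(remix_parameters(fake_audience))
```

The indirection through clusters matches Hennecke's point that the system is not a simple one-to-one mapping: an individual sad face does not flip a switch; only the summarised state of the whole audience nudges the remix.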

The new piece is ready to play immediately. During the trial run in Dortmund, it is performed purely electronically. The rhythm has changed, and very long notes have become shorter. The harmonies are more pleasing to the ear and follow different sequences. During the performance in Saarbrücken, the musicians will play the remixed version directly from electronic tablets in the hall. For them, 'The Unanswered Question' will be entirely transformed once again. "The way in which the orchestra and the audience interact makes this a special artistic experience," says Hennecke. "Digital tools and techniques from the field of data science make the experience even more vivid and human – interestingly, despite the fact that it is achieved with technology and computers."

Rehearsal performance of 'The (Un)Answered Question – a musical data science experiment'.
The audience's reactions are immediately recognised and displayed by the artificial intelligence.

Visualisations show the audience how researchers work

The researchers at the DLR Institute for Software Technology see the overall project as something very special. It comprises the recitation of a poem, data collection during the playing of a classical piece, and ultimately a remixed work. "Even if the audience is not familiar with data science, we can use the music and the visualisations to convey what is happening with the data and through the data," says Carina Haupt of the Institute for Software Technology. Together with Andreas Schreiber, she is responsible for the project at DLR. Among other things, the Intelligent and Distributed Systems department uses visualisations to display complex software architectures. "We are using a similar approach here. We are working with images and metaphors to combine and represent heart rate and emotions," adds Schreiber.

There is no break for Martin Hennecke following the November event at the Saarland State Theatre, where he also performs as a timpanist and percussionist. His next project will focus on ballet: he is writing a score that can be changed in real time by processing personal data online. This data will come from the members of the Saarland State Ballet and, of course, the audience.

An article by Katja Lenz from the DLRmagazine 171

Contacts

Katja Lenz

Editor
German Aerospace Center (DLR)
Corporate Communications
Linder Höhe, 51147 Cologne
Tel: +49 2203 601-5401

Julia Heil

Editorial management DLRmagazine
German Aerospace Center (DLR)
Communications and Media Relations
Linder Höhe, 51147 Cologne