idalab seminar #18: Generating music in realtime with Artificial Intelligence: What if music could change automatically with the emotional state in a video game?

In a time when creators of video games are trying to make gaming an ever more immersive and ever more real experience, AI is beginning to have an impact in the field. Everyone who’s ever played a video game knows: Music plays an important part in conveying the atmosphere of changing game states and thus needs to adapt dynamically as the gamer makes choices and variations of the storyline unfold. However, as storylines grow more complex and the number of possible choices practically explodes, the task of implementing musical changes in a game’s engine, while at the same time ensuring smooth transitions, eventually becomes too complex to handle manually. The task clearly calls for some form of automation – but does AI have the ability to create effective video game music?

Using AI to generate adaptive video game music is not entirely new. A wealth of research has been carried out both in academia and in the gaming industry. Deep Adaptive Music (DAM), an innovative solution developed by Melodrive, extends adaptive music by leveraging AI more intensively. DAM allows for the generation of music in real time by an AI system running directly inside an interactive experience, adapting to emotional states within the game on a granular level. Responding both to the user’s interaction and to the changing game states, DAM complements the idea of co-creation between the AI and a human composer or player. In my talk, I will first give an overview of past research in the field, then dive into the mechanics behind DAM and finally consider AI’s potential in creative tasks.

Thursday, April 4th, 7 pm | doors open at 6.30 pm | Potsdamer Straße 68, 10785 Berlin

It will also be possible to watch the event live via Vimeo. The link to the stream will be shared via Twitter and LinkedIn.

You can also find the event on Meetup and join the idalab seminar Meetup group.

Slides to Valerio’s presentation can be found on Slideshare.

About idalab seminars: idalab seminars are open to all interested parties. Once a month, we invite scholars, data scientists, business experts and big data thought leaders to discuss their work, gain new perspectives and generate fresh insights.

After the talk, we invite you to stay for drinks. We’re looking forward to seeing you there!

Valerio Velardo is CEO and co-founder of Melodrive, a startup that is developing an AI music system that automatically generates music for interactive experiences such as video games and VR/AR. He holds a PhD in music and AI from the University of Huddersfield, UK. Valerio has spent the last ten years teaching computers how to create music autonomously. In his research, he has developed an AI multi-agent system capable of simulating a society of virtual songwriters that produce rock songs and develop their own musical style. Valerio has also studied astrophysics at degree level and, in another life, has worked as a pianist, composer and conductor.