Making Music with AI: An Introduction

Image credit: qthomasbower (CC BY 2.0)

Most composers and artists today write music in a digital language that makes it easy to save, edit, and share song information. The most widely used format, the Musical Instrument Digital Interface (MIDI), has been the standard in music composition since it was introduced in 1983. Scientists and artists can use MIDI data to teach computers to make new music from existing songs and musical "rules" of composition.
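
MIDI is compact because it records musical events, not sound: a note is just a few bytes. Here is a minimal sketch of how a note-on and note-off message are encoded (the function names are my own; real MIDI files also carry timing and meta information):

```python
# MIDI channel messages pack a status byte plus one or two data bytes.

def note_on(note: int, velocity: int, channel: int = 0) -> bytes:
    """Status byte 0x90 (note-on) OR'd with the channel, then pitch and velocity."""
    return bytes([0x90 | channel, note, velocity])

def note_off(note: int, channel: int = 0) -> bytes:
    """Status byte 0x80 (note-off); velocity 0 is conventional."""
    return bytes([0x80 | channel, note, 0])

# Middle C (MIDI note 60) played at moderate velocity: three bytes in total.
msg = note_on(60, 64)
```

Because a whole performance reduces to a stream of small events like these, it is a natural vocabulary for a learning system to read and write.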

A recent example of this is "folk-rnn". Researchers fed thousands of transcribed Celtic folk tunes into a deep-learning system, which learned from the song data to create new melodies. The researchers were surprised to find that the system could produce "authentic"-sounding melodies roughly once in every five attempts.
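
The core idea, learning which notes tend to follow which from a corpus and then sampling new sequences, can be illustrated with a toy next-note model. This is a deliberately simple sketch (folk-rnn itself uses a recurrent neural network, not the counting table below, and the example melodies here are made up):

```python
import random
from collections import defaultdict

def train(melodies):
    """Record which pitch follows which across a corpus of melodies."""
    table = defaultdict(list)
    for melody in melodies:
        for a, b in zip(melody, melody[1:]):
            table[a].append(b)
    return table

def generate(table, start, length, seed=0):
    """Sample a new melody by repeatedly picking a plausible next pitch."""
    rng = random.Random(seed)
    out = [start]
    for _ in range(length - 1):
        options = table.get(out[-1])
        if not options:
            break
        out.append(rng.choice(options))
    return out

corpus = [[60, 62, 64, 62, 60], [60, 64, 62, 60, 59]]  # toy MIDI pitches
tune = generate(train(corpus), start=60, length=8)
```

A neural network replaces the counting table with a learned model that can take far more context into account, but the train-then-sample loop is the same.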

Researcher Dr Oded Ben-Tal, a musician who used some of those melodies in new original compositions, pointed out that the technique could also help musicians create more compelling material:

You can feed a system a sequence of notes, and it can produce some other sequences of notes that are kind of interesting, but that’s not music yet...
When you actually invite people to think about these notes, to work their stuff through them, then I think it can become a bit more interesting.
— Dr Ben-Tal, "An A.I. in London is Writing Its Own Music and It Sounds Heavenly", Mike Brown, 2017

Another way AI is being used to make music is by teaching a computer to listen to actual audio recordings and generate new music based on what it hears. These techniques can capture the nuances of musical performance (such as timbre and dynamic changes over time), but they are significantly harder to pull off because so much more data is required to represent audio faithfully.
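
A rough back-of-the-envelope comparison shows the gap. The audio figures below are standard CD-quality parameters; the MIDI event count is an assumption for a busy second of playing:

```python
# One second of CD-quality stereo audio:
sample_rate = 44_100       # samples per second
bytes_per_sample = 2       # 16-bit audio
channels = 2               # stereo
audio_bytes = sample_rate * bytes_per_sample * channels   # 176,400 bytes

# One second of MIDI, assuming ~10 note events of 3 bytes each:
midi_events = 10
bytes_per_event = 3        # status + pitch + velocity
midi_bytes = midi_events * bytes_per_event                # 30 bytes

ratio = audio_bytes // midi_bytes   # audio carries thousands of times more data
```

A model working on raw audio therefore has vastly more information to digest per second of music than one working on MIDI events.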

Scientists at Google Magenta developed a special synthesizer, called NSynth, that adapts this technique to help create new digital musical instruments.

The system uses a deep-learning algorithm to learn a low-dimensional representation of musical notes and their characteristics. This lets researchers combine sounds that were previously unrelated, such as a trombone note and a door slamming, to make completely new instruments.
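
Blending two sounds in that low-dimensional space amounts to mixing their embedding vectors and then synthesizing audio from the result. The sketch below shows only the mixing step; the tiny vectors are hypothetical stand-ins, and NSynth's actual encoder and decoder are a neural network, not shown here:

```python
def interpolate(a, b, t):
    """Linearly mix two embedding vectors: t=0 returns a, t=1 returns b."""
    return [(1 - t) * x + t * y for x, y in zip(a, b)]

# Hypothetical 3-dimensional embeddings of two recorded sounds:
trombone = [0.2, 0.9, 0.1]
door_slam = [0.8, 0.1, 0.5]

hybrid = interpolate(trombone, door_slam, 0.5)
# In the real system, a decoder network would turn `hybrid` back into audio.
```

Because the representation is smooth, points between two sounds decode to something that shares qualities of both rather than simply overlaying them.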

The Sync Project is harnessing the insights of both artificial intelligence and real musicians to make compelling, original music that is intended to help you with key health and wellness goals.

We developed http://unwind.ai with acclaimed musicians Marconi Union to make an original piece of music that listens to your heart rate and is designed to help you relax before sleep. Try it with your smartphone today and watch this space as we delve deeper into this promising world of music, AI, and health.