Ever heard of Taryn Southern? You might not be familiar with the YouTuber and American Idol contestant unless you’re a dedicated viewer of the show or a regular on her YouTube channel. The pop artist hit the headlines in 2017 when she released “Break Free,” a moody ballad produced with AI. Southern went on to create an entire album, I AM AI, the first LP to be entirely composed and produced using AI.

Using artificial intelligence as a tool to make music, or as an aid for musicians, is not entirely novel; it has been in practice for quite some time. While the use of artificial intelligence is increasing across various sectors, its reach into creative industries such as music hasn’t been seen as particularly noteworthy, because music making and production have always been viewed as areas that require human creative impulses and interventions. However, AI-based technologies are slowly making inroads into musical composition, mixing assistance, and audio finalization in mastering.

Verbasizer

Beginning in the ‘90s, artificial intelligence made its presence felt when David Bowie helped develop an app called the Verbasizer, which took literary source material and reordered the words at random to create new combinations that could be used as lyrics. Brian Eno, pop’s chief theoretician, has used similar generative techniques, not only to create endlessly perpetuating music on his album Reflection but also to render an entirely visual experience in 2016’s The Ship. The arrangements on Mexican composer Ivan Paz’s album Visions of Space were done by algorithms he created himself. More recently, producer Baauer made “Hate Me” with Lil Miquela, an artificial digital Instagram avatar. In 2016, researchers at Sony used software called Flow Machines to create a melody in the style of The Beatles; this material was then turned over to composer Benoît Carré, who developed it into a pop song called “Daddy’s Car.” (Flow Machines was also used to help create an entire album’s worth of music under the name SKYGGE, Danish for “shadow.”) AI technology is also integrated into popular music-making programs such as Logic, software used by musicians around the world that can auto-populate unique drum patterns. There is now an entire industry built around AI services for creating music, including Flow Machines, IBM Watson Beat, Google Magenta’s NSynth Super, Jukedeck, Melodrive, Spotify’s Creator Technology Research Lab, and Amper Music.
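
The cut-up idea behind the Verbasizer is simple enough to sketch in a few lines of code. The original software is not publicly available, so the snippet below is only a hypothetical illustration of the technique: pool the words from several source texts, shuffle them, and deal them back out as lyric-like lines.

```python
import random

def cut_up(sources, lines=4, words_per_line=6, seed=None):
    """Shuffle words from several source texts into new lyric-like lines.

    A toy illustration of the cut-up idea behind tools like the Verbasizer;
    the actual software is not public, so this is only a sketch.
    """
    rng = random.Random(seed)
    # Pool every word from every source text, then shuffle the pool.
    pool = [word for text in sources for word in text.split()]
    rng.shuffle(pool)
    # Deal the shuffled words back out as fixed-length lines.
    out = []
    for i in range(lines):
        chunk = pool[i * words_per_line:(i + 1) * words_per_line]
        if not chunk:
            break
        out.append(" ".join(chunk))
    return out

if __name__ == "__main__":
    sources = [
        "the city sleeps under a paper moon",
        "static voices drift across the wire",
        "we trade our shadows for the morning light",
    ]
    for line in cut_up(sources, lines=3, words_per_line=5, seed=42):
        print(line)
```

Seeding the random generator makes a given shuffle reproducible, which is handy when you want to revisit a combination that happened to work.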

While AI hasn’t yet reached the sophistication where a completely AI-created and AI-produced single has hit the charts, it is at the stage where algorithms can create, perform, and even monetize musical compositions. AI can make certain aspects of audio engineering, such as mastering, largely dispensable; however, it still lacks the creative human touch, so the whole process cannot yet be AI-driven. Current applications for AI-driven products include musical composition, mixing assistance, and audio finalization in mastering. Typically, audio mastering requires a room with specialized acoustics so that the engineer can hear and correct flaws in the music, such as issues in the spectral range or the stereo balance, and remove glitches, pops, and crackles. One AI-based intervention here is LANDR, a mastering service that offers an inexpensive alternative to traditional mastering and is popular with younger, relatively new artists. Over 2 million musicians have used its music creation platform to master 10 million songs.
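
To make the mastering step concrete, here is a deliberately crude sketch of an automated pass in Python, assuming the track is already loaded as a NumPy array of samples; the function name and parameters are hypothetical. This is not LANDR’s method: real mastering engines use perceptual loudness measures, multiband equalization, and compression, while this sketch only removes DC offset, nudges the track toward a target RMS level, and keeps peaks under a ceiling.

```python
import numpy as np

def master_track(audio, target_rms_db=-14.0, peak_ceiling_db=-1.0):
    """Rough, illustrative mastering pass (not any commercial service's method).

    audio: NumPy float array in [-1, 1], shape (n_samples,) or (n_samples, channels).
    """
    x = np.asarray(audio, dtype=np.float64)

    # 1. Remove DC offset, a simple spectral-balance correction.
    x = x - x.mean(axis=0, keepdims=True)

    # 2. Gain the track toward a target loudness, measured here as plain RMS.
    rms = np.sqrt(np.mean(x ** 2))
    target_rms = 10 ** (target_rms_db / 20.0)
    if rms > 0:
        x = x * (target_rms / rms)

    # 3. Keep peaks under a ceiling so the result doesn't clip (a crude limiter).
    ceiling = 10 ** (peak_ceiling_db / 20.0)
    peak = np.max(np.abs(x))
    if peak > ceiling:
        x = x * (ceiling / peak)

    return x.astype(np.float32)
```

In a traditional workflow those corrections would be made by ear in an acoustically treated room; the appeal of services like LANDR is doing a serviceable version of them automatically and cheaply.
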
When it comes to making music, many well-known musicians have advocated against the use of AI in the process, yet the developments in this sector are interesting to follow. For music making, AI systems use deep learning networks, a type of AI that relies on analyzing large amounts of data. Basically, you feed huge amounts of source material of various types, from dance hits to disco classics, into the software, which then analyses it to find patterns. After picking up on aspects such as chords, tempo, and length, and learning how notes relate to one another, the AI can write its own melodies. Some platforms deliver MIDI while others deliver audio, and some learn by examining data while others rely on hard-coded rules based on music theory to guide their output. For instance, Amper’s product allows musicians to create and download “stems”, unique portions of a track like a guitar riff or a hi-hat cymbal pattern, and rework them.
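
The learn-from-examples-then-generate loop can be illustrated with a toy model. Production systems rely on deep learning networks trained on large corpora, but the minimal sketch below uses a first-order Markov chain over MIDI note numbers, with made-up example melodies: it counts which note tends to follow which in the training material, then walks those statistics to write a new melody.

```python
import random
from collections import defaultdict

def learn_transitions(melodies):
    """Count which note tends to follow which across a set of example melodies."""
    transitions = defaultdict(list)
    for melody in melodies:
        for current_note, next_note in zip(melody, melody[1:]):
            transitions[current_note].append(next_note)
    return transitions

def generate(transitions, start, length=16, seed=None):
    """Write a new melody by randomly walking the learned transition table."""
    rng = random.Random(seed)
    melody = [start]
    for _ in range(length - 1):
        options = transitions.get(melody[-1])
        if not options:
            break  # dead end: no note ever followed this one in the examples
        melody.append(rng.choice(options))
    return melody

if __name__ == "__main__":
    # Example melodies as MIDI note numbers (60 = middle C), purely illustrative.
    examples = [
        [60, 62, 64, 65, 67, 65, 64, 62, 60],
        [60, 64, 67, 72, 67, 64, 60],
    ]
    table = learn_transitions(examples)
    print(generate(table, start=60, length=12, seed=7))
```

Delivering MIDI rather than audio, as some platforms do, amounts to emitting note lists like this one and leaving the sound design to the musician.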

The founders at Amper consider AI in music making to be akin to rap producers switching from cutting magnetic tape to make beats to using Pro Tools. In this context, AI makes the process more efficient and productive in the pursuit of creativity. Using AI also has the added benefit of allowing musicians with no formal music background to participate in making music. In a manner of speaking, it levels the playing field so that anyone can play what they hear in their head.

There are other considerations when it comes to using AI. For artists and their reps, money is a key factor, whether it is production costs or copyright and royalties. Taryn Southern, for example, shares writing credits on her album with Amper. But the software allowed her to redirect funds that would conventionally have been spent on human songwriters, session musicians, and studio time toward a management team, publicists, and videographers, other components seen as essential for the modern professional musician. Nevertheless, AI cannot yet replicate the innate talent of songwriters or the complex recording and production processes behind the best-known works of the most celebrated musicians.

While AI applications and machine learning can be seen as opportunities, a lot of research is still needed in production, the area in which AI is weakest. The recording studio, musical arrangement, and mixing remain deeply creative processes that require a human touch even now. While the creative takeover by AI has not yet occurred, there might be a tipping point some years down the line.