The convergence of music production and emerging technologies has reached a tipping point, forcing questions about what it means to be creative and what role humans will play in the creation of art in general. Smart technologies in music can now mimic human creativity by composing music without any human supervision. Many believe that emerging technologies such as artificial intelligence will eventually replace humans altogether by automating both technical processes and creativity. This notion of machines matching or surpassing human creativity raises several questions about what role humans will play in future art, if one exists for us at all.
As automated technologies are integrated into music creation (and into virtually every field that has become digitized), there are signs that people will be able to work with these technologies in new and incredible ways. Not only do automated technologies make the production process more efficient, they also enhance a musician’s capabilities in ways that we haven’t seen before. Today’s music producer has better connectivity with audiences and better access to tools and other musicians than ever before. New technologies are empowering the music producer to push creative boundaries and form new relationships with themselves, their work, and their stakeholders. With advances in artificial intelligence and machine learning, these boundaries will inevitably change what it means to be a music creator. From this perspective, our roles as creators will continue to evolve rather than diminish as we integrate so-called “smart” technologies into our work. The question comes down to how well we can learn and re-skill in order to remain relevant, so that we continue to create meaningful musical experiences for the world.
Some of the greatest benefits to modern musical composition come from the automating capabilities of artificial intelligence and related smart technologies within software. While artificial intelligence refers to intelligent machines mimicking human behaviour or intelligence, automation simply refers to hardware or software capable of performing tasks automatically. Automation has been applied across a multitude of fields and industries, and music composition and production are no exception.
While many segments of the artistic population still shy away from integrating smart technology into their music creation and expression, those who have embraced it have seen incredible jumps in their productivity. For example, artists who have traditionally paid audio engineers thousands of dollars to mix and master their work are now using digital mastering-on-demand services offered by technology startups at a fraction of the price. As a result, professional-grade music is no longer accessible only to the wealthy or elite. Club DJs who used to lug around their CDs or vinyl records can now plug into the club’s audio equipment and mix off their mobile phone using Spotify and a DJ app. The requirement of having thousands of dollars invested in top-notch DJ and production equipment has been replaced with a mere $10/month mobile phone application. These examples are just the tip of the iceberg in terms of future capabilities.
One of the most significant examples of increased efficiency in music creation and composition is the emergence of DAWs (Digital Audio Workstations), software used to record and produce a variety of audio. These software platforms have successfully lowered the barrier to entry into the music creation process because they are scalable, accessible, and affordable. In order to stay competitive in their respective niches, DAWs such as Ableton Live, Logic Pro, and Pro Tools have been adding better features, plugins, and customizations to both their hardware and software platforms. These automated features have vastly improved the process of digital music creation and have given artists a plethora of options when it comes to pre-production, post-production, and live performance.
Examples like these demonstrate the core benefit of automation in music: it takes over the tedious components of a given task in order to save the creator time and energy. Historically, automation has enabled us to find new and creative ways to work while improving our efficiency and producing better results. Although some individuals have reservations about merging artificial intelligence and music, it is difficult to argue against the value created by automation. These features work much like a calculator: the user programs them to perform clearly defined tasks, which in a live performance removes the manual input otherwise required to carry out those tasks. Although this is not a new capability, the next wave of automation foreshadows even more exciting use cases for producers and listeners.
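A minimal sketch can show what this kind of DAW parameter automation looks like under the hood, assuming a simple breakpoint model: a gain envelope is defined by a few (sample, gain) pairs, linearly interpolated across an audio buffer, and applied without any live fader moves. The breakpoints and the buffer below are hypothetical and not drawn from any particular DAW.

```python
# Sketch of DAW-style volume automation: a gain envelope defined by
# (sample_index, gain) breakpoints is interpolated across an audio
# buffer, replacing manual fader moves during playback.
# All values here are invented examples.

def automation_envelope(breakpoints, num_samples):
    """Linearly interpolate (sample_index, gain) breakpoints."""
    gains = []
    for n in range(num_samples):
        # Find the breakpoints surrounding sample n.
        prev_t, prev_g = breakpoints[0]
        next_t, next_g = breakpoints[-1]
        for t, g in breakpoints:
            if t <= n:
                prev_t, prev_g = t, g
            if t >= n:
                next_t, next_g = t, g
                break
        if next_t == prev_t:
            gains.append(prev_g)
        else:
            frac = (n - prev_t) / (next_t - prev_t)
            gains.append(prev_g + frac * (next_g - prev_g))
    return gains

def apply_automation(audio, breakpoints):
    """Scale each audio sample by its interpolated gain."""
    env = automation_envelope(breakpoints, len(audio))
    return [s * g for s, g in zip(audio, env)]

# A constant test signal faded from full volume to silence.
audio = [1.0, 1.0, 1.0, 1.0, 1.0]
print(apply_automation(audio, [(0, 1.0), (4, 0.0)]))
# -> [1.0, 0.75, 0.5, 0.25, 0.0]
```

The same breakpoint idea generalizes to any automatable parameter (filter cutoff, pan, send levels); the envelope is just a function of time that the software evaluates so the performer doesn’t have to.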
In addition to the automated support provided by algorithms, musicians and producers have also enjoyed the positive benefits that emerging technologies have brought in terms of increased connectivity and collaboration. For example, DAWs are now moving from the computer desktop to the cloud and are thereby enabling borderless connectivity, collaboration, and creation.
Outro is a Montreal-based startup whose mission is to foster new relationships within its music community. Outro’s cloud-based platform has all of the traditional DAW features, and then some: it is powered by a dynamic community and smart matchmaking features that help its users connect with other musicians. By facilitating connections between people and sounds, it allows musicians to focus on the fun part of their job: exploring and creating music. Outro’s matchmaking feature is simple: upload a music sample, and the platform analyzes it and matches it with other compatible instruments and samples using classification methods.
“The technology we develop is built to facilitate relationships and help our users reach their potential” says CEO Mark Vesprini, “The music industry will always be driven by human interactions, perceptions, and emotions.”
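Outro has not published how its matchmaking works, but one plausible sketch is feature-based similarity: each uploaded sample is summarized as a small feature vector, and cosine similarity ranks a catalogue of other users’ samples by compatibility. The feature choices, scaling, and catalogue entries below are invented for illustration, not Outro’s actual method.

```python
import math

# Hypothetical sketch of sample matchmaking: each sample is summarized
# as a feature vector, and cosine similarity ranks a catalogue of
# other users' samples by compatibility. All names and numbers are
# invented for illustration.

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_samples(query, catalogue, top_k=2):
    """Return the top_k catalogue samples most similar to the query."""
    ranked = sorted(catalogue,
                    key=lambda name: cosine_similarity(query, catalogue[name]),
                    reverse=True)
    return ranked[:top_k]

# Features per sample: [tempo scaled to 0..1, brightness, energy].
catalogue = {
    "funk_bass_loop":  [0.52, 0.3, 0.8],
    "ambient_pad":     [0.30, 0.2, 0.2],
    "house_drum_loop": [0.62, 0.6, 0.9],
}
uploaded_sample = [0.60, 0.5, 0.9]  # an energetic, dance-tempo upload
print(match_samples(uploaded_sample, catalogue))  # best matches first
```

In practice a platform would extract these features from the audio itself (tempo detection, spectral analysis) rather than hand-entering them, but the ranking step would look much the same.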
Another example of this technology is the company Ohm Force–manufacturer of Ohm Studio, the world’s first real-time collaborative digital audio workstation. Ohm Force calls itself the ‘Google Docs for audio editing’, allowing multiple collaborators to engage through the cloud and the Ohm Studio workstation to work on musical projects together regardless of location. In addition to the increased efficiency users can gain from its DAW capabilities, the platform acts as a social network–using technology to break down the barriers of time and space in order to increase collaboration and connectivity.
Ohm Force hopes to facilitate real-time collaboration
What makes platforms like Outro exciting for both music veterans and enthusiasts is the way they apply AI to connect musicians internationally. For example, someone in rural Brazil can connect online with a like-minded Montreal musician, and together they can tinker with an online music project in real time. This wouldn’t be possible without the matchmaking features powered by Outro’s AI. In today’s hyper-connected world, forming new music relationships through whatever medium is available will continue the tradition of creating a unique storytelling experience. In other words, the shared human experience is amplified by the growth and application of smart technologies such as this kind of artificial intelligence.
Although technology and artificial intelligence have increased efficiency, connectivity, and collaboration in the music community, one of the most impressive results of these integrations has been the growing capability of AI to create original music with or without the help of a master creator. In terms of music production, automation has made itself relevant at various stages of composition; we have seen examples of automated elements within both pre- and post-production, and even automated aspects of live performances that combine visual and audio elements. In the last couple of years, we have even seen AI take on the role of music creator and orchestrator.
Even though these are impressive examples, there are still doubters in the space. Some claim that artificial intelligence cannot be creative at all; it can only regurgitate elements already created by other musicians or instruments. The following example from David Cope, a researcher of musical intelligence, suggests otherwise.
While pioneers like Cope have been programming intelligent machines to compose music that matches or surpasses the abilities of some of the best music composers, there are indeed limits to the supervised algorithms that he has built. For example, a supervised algorithm requires step-by-step instructions and constant training from both data and humans in order to achieve its respective goal. Once these “expert system” algorithms go off script, they often lose their effectiveness in achieving general intelligence. However, the rise of unsupervised learning algorithms is now challenging this notion of how machines learn and apply intuition, and it will have a dramatic impact on every industry, including music.
Bach-style Chorale by Experiments in Musical Intelligence computer program created by David Cope
One subfield of artificial intelligence driving this shift is deep learning. Using deep learning techniques, machines parse large data sets and learn without human intervention; they essentially train themselves. If deep learning algorithms can learn on their own, they may learn how to be creative as well, which will inevitably change the way that musicians create and interact with others. For example, such an algorithm could parse the interests of a musician and his or her community and then suggest what kind of music to create next based on this information. These algorithms could also be applied in live performances to read a crowd’s emotions using heat-mapping techniques and suggest to the performer what song to play next or what musical key would evoke a specific outcome.
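As a rough illustration of the unsupervised idea, the sketch below groups a track library by [energy, positivity] features using a tiny k-means, with no labels or human supervision, and then maps a hypothetical crowd “mood” reading to the nearest cluster to suggest what to play next. Every track name, feature value, and the mood vector are invented; real systems would use far richer features and models.

```python
# Hedged sketch of unsupervised learning for a DJ: a tiny k-means
# groups tracks by [energy, positivity] with no labels, then a crowd
# "mood" reading selects the nearest cluster to suggest what to play
# next. All names and numbers are invented for illustration.

def dist2(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(points):
    """Component-wise mean of a non-empty list of vectors."""
    n = len(points)
    return [sum(p[d] for p in points) / n for d in range(len(points[0]))]

def k_means(points, k, iterations=10):
    """Naive k-means, deterministically seeded from the first k points."""
    centroids = points[:k]
    clusters = [[] for _ in range(k)]
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda c: dist2(p, centroids[c]))
            clusters[i].append(p)
        centroids = [mean(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

def suggest(tracks, crowd_mood, k=2):
    """Suggest tracks from the cluster closest to the crowd's mood."""
    points = list(tracks.values())
    centroids, clusters = k_means(points, k)
    best = min(range(k), key=lambda c: dist2(crowd_mood, centroids[c]))
    return sorted(name for name, feat in tracks.items()
                  if feat in clusters[best])

# Hypothetical library: [energy, positivity] per track, both in 0..1.
tracks = {
    "upbeat_anthem": [0.9, 0.8],
    "dance_banger":  [0.85, 0.7],
    "slow_ballad":   [0.2, 0.3],
    "mellow_groove": [0.3, 0.35],
}
print(suggest(tracks, [0.8, 0.75]))  # high-energy crowd -> upbeat cluster
```

The key point is that the clusters emerge from the data alone; no one tells the algorithm which tracks are “upbeat,” which is exactly what distinguishes this from the supervised, expert-system approach described above.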
New technology continues to play an evolving role in how we create and consume music. Musicians and enthusiasts have used these technologies, namely automated systems and AI, to remove inefficiencies in the creation process, to increase the connectivity and collaboration of the global music scene, and finally to create entirely new musical forms. It will be interesting to see where this goes next.
Regardless of what new and exciting AI technologies emerge in music, they will only remain in use by musicians if they enhance the creator’s role of providing an emotional, entertaining, or entirely new experience for his or her audience. In other words, musicians and entertainers will have to continuously push the boundaries of technology in order to remain fresh and relevant in and out of the studio. Even though breakthroughs in deep learning can open new ways of working, they may not always be accepted, even by the earliest adopters.
In today’s globally connected world, the convergence of artificial intelligence, cloud computing, the Internet of Things (IoT), and virtual and augmented reality platforms will reshape our identities as we experience the world through sound and video. AI may eventually have the potential to replace music creators; however, if its interaction with music creation today is any indication, we can expect new forms of creation and collaboration, not complete and utter replacement. The only thing that remains certain about the future of experiencing sound is that the experience will feel more real, connected, and immersive than ever before.
XCap Insights is a Montreal-based intelligence community that educates and connects individuals, initiatives, and organizations to the world of emerging technologies. You can find them at www.xcapinsights.com or on Facebook.