Artificial intelligence can write music, but does that make it an artist and a creator? AI is already changing the music industry, and the law will have to answer some strange questions about authorship. Do AI algorithms create their own work, or is it the work of humans? And what happens if, for example, an AI trained on a particular singer's songs produces a track in her style?
American copyright law, for instance, never uses the word "person," and there has been little litigation on the question in the courts. As a result, AI falls into a gray, highly ambiguous zone of copyright. Nor do the laws account for AI's unique capabilities: the ability to work without interruption and to copy the style of specific performers. Depending on which way the law leans, AI could turn out to be a creative facilitator, an evil that takes jobs away from people, or both.
Even now, AI can copy an artist's style, and the law does nothing to prevent it. The law does not require you to share profits with an artist unless you use specific samples from already released songs.
Courts are reluctant to protect artists against works that merely imitate their style, since musicians have influenced one another since the dawn of time. For copyright issues to arise, the AI would need to create a song that sounds essentially the same as an existing one. Still, if you deliberately train AI on the songs of a particular artist, you may run into trouble.
Those operating the AI may infringe the artist's exclusive right to create derivative works based on previously released material.
Even if a system could accurately emulate an artist's sound, the artist would spend a long time trying to prove that the AI was built specifically to copy that sound. Copyright law requires showing that the person accused of plagiarism was actually familiar with the original. And when AI stands accused of plagiarism, who can prove that the algorithm was trained on the accusing artist's tracks?
It is also unclear whether it is permissible to train AI on copyrighted music at all. When you buy a song, do you also acquire the right to use it as AI training data? Experts note that there is no definitive answer to this question yet.
Reverse engineering a neural network is no easy task, and it is hard to determine which songs were fed into it, because the model "is just a configuration of digital values."
Another important issue with AI is authorship. Who is the author of such tracks: the algorithm itself, or the people who wrote it? Denying copyright to the humans behind AI-generated work might seem like a reasonable ruling, but it could seriously reduce these algorithms' usefulness in creative work. If you accept AI output as a new form of creativity while stripping rights from those who created the algorithm, you simply take away their incentive to create. That, at least, is how music industry experts describe the situation.