The National Music Publishers’ Association (NMPA) has released a letter — written to U.S. Senate Majority Leader Chuck Schumer (D-NY), on April 24 — urging him to “consider the impact of AI technologies on the creative industries in the U.S.”
“If left unchecked,” NMPA president and CEO David Israelite writes, AI will “pose a threat to human creativity.” Israelite notes that many AI models learn to generate convincing musical works by training on a wide swath of copyrighted material, often numbering in the millions or billions of pieces, created by the artists and songwriters the NMPA’s membership represents. He asks that “complete records” of how these models are trained be made available to all, and that Schumer’s efforts include “clear standards” for tracking which inputs are used to produce these AI-generated works.
“Without visibility into which copyrighted works are used — and how the AI systems use them — rightsholders will have no meaningful copyright protection with respect to AI technology,” he explains.
The NMPA’s letter was written in response to Schumer’s announcement on April 13, stating that he would launch a “major effort to get ahead of artificial intelligence.”
“Given the AI industry’s consequential and fast moving impact on society, national security, and the global economy, I’ve worked with some of the leading AI practitioners and thought leaders to create a framework that outlines a new regulatory regime that would prevent potentially catastrophic damage to our country while simultaneously making sure the U.S. advances and leads in this transformative technology,” Schumer said.
The NMPA urges Schumer to regulate AI companies by requiring them to “seek permission and licenses from copyright owners” for the use of their music as training data, and to allow publishers and songwriters to license the use of their songs for the development of AI models on the free market. “Rightsholders must also retain exclusive control over how and with what technologies their works are used,” Israelite writes.
In recent months, rightsholders have become more focused than ever on regulating not just what is produced by AI models, but also what is used to teach the models to generate such works in the first place. Lawsuits, like the one visual artists Sarah Andersen, Kelly McKernan and Karla Ortiz filed against text-to-image generation firm Stability AI, detail what many copyright owners and creatives hope to establish as ground rules for AI companies in the future. In interviews, Andersen has dubbed their hoped-for regulatory standards “the three C’s”: consent, compensation, and credit.
So far, the question of whether AI companies must obtain consent, or provide compensation and credit, to use copyrighted works as training fodder for their models remains unsettled around the world, including in the U.S. Lawmakers like Schumer face difficulty regulating this burgeoning industry’s machine learning practices, both because of AI’s rapid development and because of today’s competitive, globalized marketplace.
While the European Union appears to be building a stronger framework for protecting copyrighted material used as training data, countries like Singapore, Israel, China, Japan and South Korea have friendlier, laxer AI-related policies, potentially offering AI companies safe havens for training abroad and undermining the regulations set forth by stricter, copyright-conscious countries. So far in the U.S., many lawyers point to “fair use” as a potential defense for AI companies already using copyrighted material without permission or remuneration, but more clarity is expected as cases like Andersen, Ortiz and McKernan’s reach their verdicts and initiatives like Schumer’s get further along.
In the meantime, the entertainment industry has proactively formed a major coalition of more than 80 organizations, including the NMPA as well as ASCAP, BMI, RIAA and SESAC, called the Human Artistry CampAIgn. Announced at South by Southwest last month, the coalition detailed seven core principles intended to protect creators’ copyrights in the age of AI.
As principle No. 4 states: “Governments should not create new copyright or other IP exemptions that allow AI developers to exploit creators without permission or compensation.”