Major players in the music and gaming industries are moving from legal disputes over copyright to strategic collaborations with generative AI companies. This shift focuses on creating licensed AI tools that ensure artists and rights holders are compensated and maintain control over their work.
A key example is the recent settlement between Universal Music Group (UMG), the world’s largest record company, and the AI music startup Udio. Previously, UMG, alongside Sony Music and Warner Music, had sued Udio for allegedly using copyrighted music without permission to train its AI models. The settlement has resulted in a new partnership. According to the companies, Udio will launch a subscription service allowing fans to create new music, such as remixes, based on a licensed catalog of songs. Under the agreement, UMG artists must opt in to have their music included, and any user-generated tracks must remain on the Udio platform. Michael Nash, UMG’s chief digital officer, stated that artists who opt in will be paid for the use of their music both in training the AI and in the new works created by subscribers.
This collaborative approach is becoming an industry-wide trend:
- Spotify has announced a partnership with all three major labels—UMG, Sony, and Warner—to develop “responsible” AI music products that create new revenue streams and strengthen artist-fan connections.
- UMG has also partnered with Stability AI to develop music creation tools specifically for artists, producers, and songwriters, emphasizing that the AI models will be trained responsibly.
AI partnerships extend to gaming
The trend is not limited to music. Video game company Electronic Arts (EA) has also partnered with Stability AI to co-develop AI tools for its game developers. Described by an EA executive as “smarter paintbrushes,” these tools are intended to accelerate creative workflows, such as generating realistic 2D textures for 3D environments. Prem Akkaraju, CEO of Stability AI, said these collaborations place creators at the center by building AI tools around their specific needs.
Across these agreements, a common framework for “responsible AI” is emerging. It is built on the principles of using licensed data for training, obtaining consent from creators, and establishing clear models for compensation.
Sources: Wall Street Journal, Billboard, Variety