Spotify removed tens of thousands of songs created with an artificial intelligence (AI) platform and strengthened its monitoring systems to detect such files and fraudulent activity. The streaming service confirmed that the tracks were linked to Boomy, a company providing tools to generate music from scratch in various styles using AI. Boomy, a startup founded in California, United States, promises users the ability to produce original songs in seconds and has reportedly helped create more than 14 million audio tracks.
According to a person familiar with the matter, Spotify removed roughly 7 percent of tracks uploaded via Boomy, equating to tens of thousands of songs, after a warning from Universal Music Group (UMG) about suspicious streaming activity on these works. The move comes as UMG has raised concerns about the copyright implications of training AI models on songs from major labels.
Another source close to the situation described suspected "artificial streaming," with Boomy-generated songs apparently being played by online bots masquerading as listeners. Spotify, for its part, said it had removed content associated with Boomy and continues working to purge AI-generated material of this kind from its catalog.
Universal Music, which controls a substantial share of the recorded music market, urged both Spotify and Apple Music last month to block the training of AI models that could infringe the copyrights of the songs in their catalogs.
This issue has sparked ongoing debate among industry professionals. Earlier this year, a coalition of artists filed lawsuits against Stability AI, DeviantArt, and Midjourney for copyright infringement over artwork produced with Stable Diffusion, highlighting concerns about AI-generated content across media formats. (Financial Times)
Scholars and executives point to an evolving tension between creative rights and the rapid advancement of generative AI in music. The discussion centers on how AI systems learn from existing works, where the line lies between inspiration and replication, and what safeguards are needed to protect artists and their catalogues.
Industry observers expect more policy clarifications as platforms continue to refine their content moderation and rights management tools. The Boomy episode underscores the need for transparent licensing models, user accountability, and robust detection mechanisms to distinguish authentic performances from machine-generated material. As the market monitors developments, companies in North America and beyond are weighing new approaches to verify provenance and ensure fair compensation for creators whose works may be used to train AI systems.
In the broader landscape, creators increasingly rely on AI to explore new sounds and streamline production workflows. Yet copyright owners maintain that without clear consent and licensing, AI-assisted outputs could dilute the value of original recordings. The industry will likely see a blend of voluntary codes, regulatory guidance, and marketplace safeguards aimed at balancing innovation with artist rights. (Music Industry News)