AI was a problem for Spotify long before Daniel Ek's investment in AI weapons of war prompted a growing number of artists to remove their music from the platform in protest. Some artist pages on the streaming service have been infested with fake songs, many of them AI-generated. The people at Spotify apparently recognize this is a problem, and they've issued a statement laying out their plans to curtail it.
The press release begins like so:
Music has always been shaped by technology. From multitrack tape and synthesizers to digital audio workstations and Auto-Tune, every generation of artists and producers has used new tools to push sound and storytelling forward.
However, the pace of recent advances in generative AI technology has felt quick and at times unsettling, especially for creatives. At its best, AI is unlocking incredible new ways for artists to create music and for listeners to discover it. At its worst, AI can be used by bad actors and content farms to confuse or deceive listeners, push “slop” into the ecosystem, and interfere with authentic artists working to build their careers. That kind of harmful AI content degrades the user experience for listeners and often attempts to divert royalties to bad actors.
The statement lays out a three-pronged approach to combating AI slop on Spotify. First, the service is implementing a new impersonation policy with stricter rules designed to prevent the cloning of artists' voices. Second, they're rolling out a new spam filter intended to identify accounts that are uploading slop and stop recommending them to users. Lastly, they're working to develop a new industry standard for AI disclosure in credits: "We know the use of AI tools is increasingly a spectrum, not a binary, where artists and producers may choose to use AI to help with some parts of their productions and not others. The industry needs a nuanced approach to AI transparency, not to be forced to classify every song as either 'is AI' or 'not AI.'"
But if Spotify is focused on stamping out AI slop, they haven't gotten around to cleaning up Volcano Choir's page. The Bon Iver side project hasn't released an album since 2013's phenomenal Repave, and they've been on hiatus since 2014, when they toured and contributed an unreleased track to a charity compilation. Yet, as Futurism points out, a "new" "Volcano Choir song" recently appeared on the group's profile, and it has all the hallmarks of slop. Titled "Silkymoon Light," the track has what looks like AI-generated art, simplistic lyrics, vocals that only vaguely resemble Justin Vernon, and none of the slow-build post-rock dynamics that usually characterize Volcano Choir's music. If Spotify actually makes progress toward ridding their platform of this kind of thing, it will be better for everyone.
UPDATE: A spokesperson for Spotify has issued this statement to Stereogum regarding the fake Volcano Choir song:
We’re aware of the issue and have removed the content. AI is speeding up problems the music industry has faced for a while – like spam, fraud, and misleading uploads – which is why we recently rolled out new policies.
Since music moves through a complex supply chain, bad actors can sometimes exploit gaps in the system to push the wrong tracks onto artist profiles. We’re working closely with distributors to block these uploads at the source, and investing more resources into our content mismatch process, reducing the wait time for review, and enabling artists to report mismatches even in the pre-release state.