Folk artist Murphy Campbell discovered fake versions of her own songs appearing on her Spotify profile in January. The songs used her recordings from YouTube, but someone had used AI to alter the vocals before uploading them without her permission.
This highlights a growing problem for musicians as AI tools make it easier than ever to steal and manipulate someone else’s work. Campbell never uploaded these particular songs to Spotify herself, yet there they were, generating streams and potentially revenue for whoever put them there.
The New Wild West of Music Theft
Campbell realized something was wrong when the vocals on songs she knew she had recorded didn’t sound quite right. Someone had taken her YouTube performances, run them through AI software to subtly alter the vocals, and then uploaded the modified versions to Spotify under her name.
This isn’t just about one folk singer. Musicians across genres are finding AI-manipulated versions of their work appearing on streaming platforms. The technology has advanced to the point where anyone can take a song, alter it with AI, and re-upload it elsewhere.
The incident also shows how music pirates are getting more sophisticated. By using AI to slightly modify stolen recordings, they hope to slip past automated detection while still profiting from someone else’s creativity.
What Musicians Can Expect
Spotify and other platforms are scrambling to detect these AI fakes, but it has become a cat-and-mouse game. Artists now need to monitor streaming services regularly to make sure unauthorized versions of their songs aren’t appearing under their names.
Campbell’s experience suggests this problem will get worse before it gets better, as AI tools become more accessible and harder to detect.