AI music is relatively new, but scams have been around for ages. The advent of AI has made it so much easier for people to scam companies out of millions of dollars, and this next story is a prime example. An AI “musician” was just indicted for scamming streaming services out of a total of over $10 million.
How an AI “musician” was indicted for scamming streaming services out of millions
While making money off of AI-generated music should be considered borderline stealing, this story involves more direct fraud. The man indicted is Michael Smith, and he had been running his scheme since long before most people had even heard of generative AI.
In fact, the scam began all the way back in 2017, five years before ChatGPT even hit the market. According to the indictment, Smith generated a huge number of AI music tracks. The music was distributed through an unnamed AI company, which took a cut of the ill-gotten profits. We're not sure what sort of tool he was using, but it was presumably pretty rudimentary.
In any case, he was able to generate tracks on the order of thousands. The company then distributed those tracks en masse through several streaming services, including Spotify, Amazon Music, YouTube Music, and Apple Music. That's bad enough, but it was only the first step.
A fake audience of listeners
Next, Smith, along with unnamed partners within and outside of the U.S., created a large number of accounts for these services. These were bot accounts that would “listen” to these AI-generated tracks constantly. Each time the songs were listened to, they’d generate a small amount of revenue, and that would add up.
A scheme as lucrative as this needed a massive number of accounts. According to the report, Smith owned 52 cloud service accounts, and each of those had 20 associated bot accounts. This brought the total number of bot accounts up to 1,040.
Each of these accounts streamed the AI-generated tracks every day. With each account able to stream up to 636 tracks per day, that put the total in the neighborhood of 661,440 streams daily.
The scammers were careful to spread the streams across a vast number of tracks. This was to avoid raising any red flags. It would seem suspicious if 1,040 accounts randomly started listening to the same songs over and over again.
The numbers
They earned about half a cent per stream, which equates to about $3,307.20 every day. Remember, this went on for seven years!
Each month, they gained about $99,216, and each year they pulled in roughly $1,207,128. At those rates alone, seven years comes to around $8.4 million; per the indictment, the total take was more than $10 million. Now, $10 million is absolutely microscopic for companies like Amazon, Apple, Spotify, and YouTube. However, they're still not happy about it.
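To see how the reported figures fit together, here's a quick back-of-the-envelope sketch of the arithmetic. The account counts, streams per day, and half-cent-per-stream rate all come from the report above; the variable names are just illustrative, and the math is done in whole cents to avoid floating-point rounding:

```python
# Back-of-the-envelope math for the streaming scheme, using the
# figures from the indictment. All money is tracked in whole cents.

CLOUD_ACCOUNTS = 52      # cloud service accounts Smith owned
BOTS_PER_ACCOUNT = 20    # bot streaming accounts per cloud account
STREAMS_PER_BOT = 636    # max tracks each bot could stream per day

bot_accounts = CLOUD_ACCOUNTS * BOTS_PER_ACCOUNT   # 1,040 bot accounts
daily_streams = bot_accounts * STREAMS_PER_BOT     # 661,440 streams/day

# Payout of about half a cent per stream: one cent per two streams.
daily_cents = daily_streams // 2                   # $3,307.20/day
monthly_cents = daily_cents * 30                   # ~$99,216/month
yearly_cents = daily_cents * 365                   # ~$1,207,128/year

print(f"bot accounts:  {bot_accounts:,}")
print(f"daily streams: {daily_streams:,}")
print(f"daily:   ${daily_cents / 100:,.2f}")
print(f"monthly: ${monthly_cents / 100:,.2f}")
print(f"yearly:  ${yearly_cents / 100:,.2f}")
```

Running the numbers this way makes it clear how quickly a half-cent-per-stream payout compounds once you have a thousand accounts streaming around the clock.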
We don’t know how much of the money Smith was able to keep for himself, but he moved $1.3 million to one of his own bank accounts in the States between 2020 and 2023.
The lies continue
Aside from the money he kept for himself, Smith moved some of the stolen funds to a debit card distributor based in Manhattan. He had a list of fake names, each tied to an email address and a streaming account, and he led the distributor to believe those names belonged to employees of a company he owned. The distributor duly issued cards to these "employees," and the cards were used to pay for the bot accounts' streaming subscriptions.
It was an elaborate plan that all came crashing down after seven years. Hopefully, the platforms will be able to identify the AI-generated tracks and remove them.
This is a first, but not really
This case was announced by the U.S. Attorney’s Office for the Southern District of New York. In the announcement, it was referred to as the “first criminal case involving artificially inflated music streaming.” The indictment was a sizeable 18 pages.
Right now, we’re still waiting on a bunch of information like how long Smith will be behind bars, who his partners were, whether they’ve been caught or not, etc. So, we’ll keep you updated as more details come out.
As for the case, it's a bit scary to know that thousands of AI-generated tracks have been on music streaming platforms for years. Back in 2017, the peak of consumer AI was Google Assistant, so knowing that tools capable of generating this content existed even then is a bit worrying.
This case, while it does deal with AI-generated content, doesn’t really set the pace for other AI content cases going forward. Again, this dealt with direct defrauding and scams. It’s not about people generating revenue from generated content or the use of copyrighted content to train models.
As such, the legality of AI-generated music will continue to be murky. It’s going to be that way as more people find ways to forgo creativity and distribute their AI-generated “music” to the masses. Hopefully, there will be some sort of legal framework in place soon.