
How One Man Used AI to Steal Millions From Real Music Artists
AP Photo/Michael Dwyer

A North Carolina musician pleaded guilty on Thursday in what federal prosecutors call the first criminal case involving AI-assisted music streaming fraud in the United States.


The case could spark more conversation about how artificial intelligence is used and the role the government might play in regulating it.

Michael Smith admitted he used fake songs and fake listeners to steal millions in royalties from legitimate artists, according to the U.S. Attorney’s Office for the Southern District of New York. He pleaded guilty to a single count of conspiracy to commit wire fraud and agreed to forfeit over $8 million. He faces up to five years in prison.

Smith was accused of using AI tools to create hundreds of thousands of low-cost songs. He set up more than 1,000 bot accounts on services like Spotify, Apple Music, Amazon Music, and YouTube Music that were programmed to stream songs on repeat, thereby generating revenue. 

Smith estimated that his automated network could get more than 660,000 plays per day, translating into over $1 million in annual royalties. He ran the scam between 2017 and 2024 before a royalty watchdog flagged suspicious activity in his catalog and halted payments. 
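Smith's own numbers roughly check out. A quick back-of-the-envelope calculation (the per-stream royalty rate below is an assumed industry-typical ballpark figure, not one stated in the case documents):

```python
# Sanity-check Smith's estimate: 660,000 plays/day -> "over $1 million" a year.
# ASSUMPTION: ~$0.004 per stream, a commonly cited blended payout rate;
# actual rates vary by platform and are not specified in the case.
plays_per_day = 660_000
assumed_rate_per_stream = 0.004

annual_plays = plays_per_day * 365
annual_royalties = annual_plays * assumed_rate_per_stream

print(f"{annual_plays:,} plays/year -> ${annual_royalties:,.0f}")
```

At that assumed rate, roughly 241 million plays a year works out to just under $1 million, in line with Smith's estimate.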

Rolling Stone reported that Smith spread his faux streams across many tracks and services to make it “more difficult to detect.” A distributor flagged him for possible fraud. Smith claimed in an email that “there is absolutely no fraud going on whatsoever!”


Yet, at the same time, he was emailing partners, saying, “We need to get a TON of songs fast to make this work around the anti-fraud policies these guys are all using now.”

This is part of a growing problem regarding artificial intelligence-created music. The Rolling Stone report explained that this harms flesh-and-blood artists because streaming services pay them from a shared pool based on total plays. This means Smith’s fake songs “stole millions in royalties that should have been paid to musicians, songwriters, and other rights holders whose songs were legitimately streamed.”
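The dilution effect is easy to see with a toy model. The sketch below assumes a simplified pro-rata payout scheme (real payout formulas vary by service, and the artist names and numbers are hypothetical), showing how bot streams shrink legitimate artists' share of a fixed royalty pool:

```python
# Minimal pro-rata payout sketch. ASSUMPTION: a single fixed pool split
# purely by play counts; real streaming payouts are more complicated.
def prorata_payouts(pool, plays):
    """Split a fixed royalty pool in proportion to each catalog's plays."""
    total = sum(plays.values())
    return {name: pool * n / total for name, n in plays.items()}

pool = 1_000_000.0  # hypothetical monthly royalty pool

# Without fraud, two real artists split the whole pool.
honest = prorata_payouts(pool, {"artist_a": 600_000, "artist_b": 400_000})

# Add a bot-driven catalog: the pool is unchanged, but it's now split 3 ways.
with_bots = prorata_payouts(
    pool, {"artist_a": 600_000, "artist_b": 400_000, "bot_catalog": 250_000}
)

print(honest["artist_a"])     # 600000.0
print(with_bots["artist_a"])  # 480000.0 -- the real artist's cut shrinks
```

The pool doesn't grow when fake streams are added; every dollar the bot catalog collects comes directly out of what legitimate rights holders would have been paid.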

Some experts estimate that as much as 10 percent of all streams may be fraudulent, costing the industry billions of dollars per year.

Law enforcement officials say Smith’s case is an early example demonstrating how AI-enabled fraud is affecting streaming platforms as scammers use these tools to churn out vast libraries of content and then deploy bots or click farms to manufacture “listens” at scale.

From The Hollywood Reporter:

Streaming fraud has been a rampant issue in the music industry for years, a problem only exacerbated by AI now that fraudsters can quickly generate thousands of songs to flood the zone on streaming services like Spotify and Apple Music. The French music streaming service Deezer previously reported that it’s seeing 60,000 AI songs uploaded to its platform every day, further noting that as much as 85 percent of streams on those tracks are fraudulent.

As The Hollywood Reporter exclusively reported in February, Apple Music doubled its penalties for those caught engaging in streaming fraud, with the company saying AI’s impact on fraud was a factor in the decision.


Meanwhile, the music industry is struggling to figure out how to treat music generated by AI. The technology can already mimic human voices and compositional styles well enough that casual listeners might mistake a fake artist for a real one. Listening closely still reveals the difference, but the technology is in its early stages; with further advances, it could come much closer to the real thing.

The question is: What happens when people aren’t sure whether their favorite artist is even human?

