💼 Labor & AI

75,000 AI Songs Hit Deezer Every Day. Only One Streaming Platform Will Tell You Which Ones Are Real.

AI-generated tracks now account for 44% of all new music uploaded to Deezer. 85% of those streams are fraudulent. Spotify paid rights holders $11 billion last year but refuses to disclose its own AI numbers. Run the dilution math across the industry, and the silence starts to look expensive.

[Image: Abstract visualization of endless digital music waveforms flooding a dark server room, synthetic blue bars vastly outnumbering warm gold ones]

Seventy-five thousand. That is how many fully AI-generated tracks Deezer reported receiving every day as of April 20, 2026. Put differently: 44% of all new music uploaded to the platform is now synthetic, more than two million AI songs per month, produced by tools like Suno and Udio that can generate a radio-ready track in under thirty seconds.

Fifteen months ago that number was 10,000 per day and accounted for 10% of uploads. Growth since then: 20,000 by April 2025, 30,000 by September, 50,000 by November, 60,000 by January 2026, and now 75,000. A 650% increase in just over a year, with no indication the curve is flattening.
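The growth arithmetic can be checked directly from the reported milestones. A minimal sketch, using only the Deezer figures cited above:

```python
# Daily AI-track upload counts reported by Deezer, keyed by month.
daily_ai_uploads = {
    "2025-01": 10_000,
    "2025-04": 20_000,
    "2025-09": 30_000,
    "2025-11": 50_000,
    "2026-01": 60_000,
    "2026-04": 75_000,
}

start = daily_ai_uploads["2025-01"]
end = daily_ai_uploads["2026-04"]

# Percentage increase over the fifteen-month window.
pct_increase = (end - start) / start * 100
print(f"{pct_increase:.0f}% increase")  # 650% increase
```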

Here is what makes this story unusual: Deezer is the only major streaming platform that publicly discloses these numbers, while Spotify, Apple Music, Amazon Music, and YouTube Music refuse to do the same. Deezer tags AI-generated content, removes it from algorithmic recommendations, detects fraudulent streams, and demonetizes them. Every other major platform either lacks equivalent systems, or has them and chooses not to talk about it, and either answer is a problem.

The Numbers Behind the Noise

Deezer's data tells a cleaner story than you might expect. Although AI tracks make up 44% of uploads, they account for only 1 to 3% of total streams on the platform. That disparity reveals something important about listener behavior: the overwhelming majority of paying subscribers still gravitate toward music made by human beings, whether consciously or through the algorithmic inertia of established playlists and curated recommendations. Listeners are not, by and large, choosing to hear AI-generated music. What those tracks are doing instead is being farmed: Deezer found that 85% of streams directed at AI-generated tracks are fraudulent, driven by bots designed to extract royalties from the shared revenue pool.

Royalties flow through a mechanism that is straightforward and merciless: streaming platforms pay from a common pool proportional to total streams, so every fraudulent stream directed at a synthetic track dilutes the per-stream payout to every human artist on the platform. It is not theft from a vault but from a shared pot, which makes it harder to see, harder to quantify, and harder to prosecute.
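The mechanism can be sketched in a few lines. All numbers below are hypothetical, chosen only to illustrate how a pro-rata pool dilutes, not to reflect any platform's actual figures:

```python
# Pro-rata royalty model: every stream earns pool / total_streams,
# so bot streams do not steal from a vault -- they shrink everyone's share.

def per_stream_payout(pool_usd: float, total_streams: int) -> float:
    """Per-stream rate under a shared pro-rata pool."""
    return pool_usd / total_streams

pool = 100_000_000            # monthly royalty pool, USD (hypothetical)
human_streams = 4_000_000_000  # legitimate streams (hypothetical)
bot_streams = 80_000_000       # fraudulent streams on synthetic tracks (2%)

clean = per_stream_payout(pool, human_streams)
diluted = per_stream_payout(pool, human_streams + bot_streams)

# Royalties siphoned to the bot-farmed tracks, paid out of everyone's pool.
siphoned = bot_streams * diluted
print(f"per-stream rate: {clean:.6f} -> {diluted:.6f}")
print(f"siphoned from the pool: ${siphoned:,.0f}")
```

Note that the pool itself never shrinks; only the per-stream rate does, which is why the extraction is invisible to any individual artist's dashboard.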

Meanwhile, human uploads to Deezer have barely moved, a contrast so stark it reframes the entire conversation from one about musical innovation to one about industrial-scale extraction. In January 2025, Deezer received approximately 90,000 non-AI tracks per day; by April 2026 that figure sits around 95,500, a 6% increase over fifteen months that barely registers against the synthetic flood. Human music creation is growing at the pace of population, while AI music creation is growing at the pace of compute: daily AI upload volume has roughly doubled every four months, with no sign of convergence.

What Happens When You Extrapolate

Deezer has roughly 9 million subscribers, a fraction of a market in which Spotify commands 260 million, Apple Music claims an estimated 100 million, and Amazon Music, YouTube Music, and Tidal add tens of millions more. The combined streaming market paid rights holders approximately $22 to $24 billion in 2025, with Spotify alone accounting for $11 billion.

If Deezer's 44% upload ratio is even roughly representative of the wider industry, the absolute numbers become staggering: hundreds of thousands of synthetic tracks would be flooding the combined catalogs of Spotify, Apple Music, Amazon Music, and YouTube Music every single day, without public acknowledgment from any of those platforms. Music Business Worldwide reported 100,000 tracks uploaded to Spotify daily as of 2023. That figure has almost certainly grown since, and applying Deezer's 44% ratio yields approximately 44,000 AI-generated tracks hitting Spotify every day, potentially more given that Spotify lacks Deezer's visible deterrent effect.
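The extrapolation is a single multiplication, but the sketch below makes its assumptions explicit: Music Business Worldwide's 2023 upload figure is treated as a floor, and Deezer's ratio is treated as representative of Spotify, neither of which is guaranteed:

```python
# Assumption 1: Spotify's daily uploads are at least MBW's 2023 figure.
spotify_daily_uploads = 100_000

# Assumption 2: Deezer's 44% AI-upload share transfers to Spotify.
deezer_ai_ratio = 0.44

implied_ai_uploads = spotify_daily_uploads * deezer_ai_ratio
print(f"{implied_ai_uploads:,.0f} implied AI tracks/day on Spotify")
# 44,000 implied AI tracks/day on Spotify
```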

Now the dilution math: Deezer reports that AI tracks capture 1 to 3% of total streams before detection. If that rate holds across platforms that lack detection, and the global streaming industry pays out $22 billion annually, the undetected royalty drain falls between $220 million and $660 million per year. At the high end of AI stream penetration, it approaches $720 million. These are rough estimates, but they use Deezer's ratios as a proxy for an industry that refuses to publish its own data, which is exactly the point.
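The same back-of-envelope range, computed from the stated inputs. These are the article's own figures; the calculation is only as good as the assumption that Deezer's stream-share ratio holds industry-wide:

```python
# Global annual streaming payouts, USD (industry estimate for 2025).
payout_low, payout_high = 22e9, 24e9

# Share of total streams going to AI tracks (Deezer's 1-3% observation).
ai_share_low, ai_share_high = 0.01, 0.03

low_estimate = payout_low * ai_share_low      # floor of the range
mid_estimate = payout_low * ai_share_high     # upper bound at low payout
high_estimate = payout_high * ai_share_high   # upper bound at high payout

print(f"${low_estimate/1e6:.0f}M to ${mid_estimate/1e6:.0f}M per year, "
      f"approaching ${high_estimate/1e6:.0f}M at the high end")
```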

A joint study by CISAC and PMP Strategy, conducted with participation from multiple major industry players including Deezer, concluded that nearly 25% of creators' revenues are at risk by 2028, implying annual losses approaching €4 billion ($4.3 billion) at current trajectory. That projection assumes no intervention and presumes the other four major platforms continue not to disclose, not to detect, or not to demonetize at Deezer's rate.

| Platform | AI Upload Disclosure | AI Content Tagging | Fraud Detection Reported | Demonetization Policy |
| --- | --- | --- | --- | --- |
| Deezer | Yes (quarterly) | Yes (since June 2025) | 85% fraudulent | Yes, excludes from royalties |
| Spotify | No | No public system | Not disclosed | Policy stated, no data |
| Apple Music | No | No public system | Not disclosed | No public policy |
| Amazon Music | No | No public system | Not disclosed | No public policy |
| YouTube Music | No | No public system | Not disclosed | No public policy |

The First Criminal Prosecution

Michael Smith, a singer-songwriter from North Carolina, pled guilty in early 2026 to what the U.S. Department of Justice called the first criminal prosecution for AI-powered music streaming fraud. Smith generated thousands of synthetic tracks, deployed bot farms to stream them continuously, and extracted more than $8 million in fraudulent royalties from Spotify, Amazon Music, Apple Music, and other platforms before being caught.

He was one person, and he stole $8 million. Nobody caught him in time. Deezer alone detects fraudulent AI streams at a rate that suggests the aggregate problem is orders of magnitude larger. Smith's prosecution matters less as a deterrent and more as a proof of concept, a demonstration that the fraud works and that detection at other platforms was insufficient to prevent an $8 million extraction before law enforcement intervened.

The 97% Problem

Deezer commissioned an Ipsos survey of 9,000 people across eight countries and found that 97% could not distinguish AI-generated music from human-made music in a blind listening test. Eighty percent of respondents agreed that fully AI-generated music should be clearly labeled, a consensus that held across every demographic group and country surveyed, and one that stands in jarring contrast to the near-total absence of labeling infrastructure on four of the five major platforms. The gap between that demand and that infrastructure is the crisis in miniature: people want labeling, but labeling only works if platforms implement detection, and only one platform does.

Deezer's detection system uses patent-pending technology filed in December 2024, trained to recognize signatures from generative models like Suno and Udio. It can be extended to detect output from new models as long as training data is available, and Deezer began licensing this technology to other industry players in January 2026, though it has attracted only one known customer so far, the Hungarian platform Eji. Whether Spotify, Apple Music, or Amazon Music are evaluating the technology remains unknown because none has said.

The Strongest Case Against Alarm

Deezer's numbers may overstate the problem for the broader industry, and several structural factors suggest caution before extrapolating its ratios to platforms with fundamentally different user bases, content moderation philosophies, and relationships with major record labels. Deezer is the smallest of the major streaming platforms with roughly 9 million subscribers compared to Spotify's 260 million. Its user base skews European and younger, demographics that may attract a disproportionate share of synthetic content uploaders seeking to exploit smaller platforms with potentially weaker moderation infrastructure and lower barriers to catalog saturation. Its 44% upload share could reflect Deezer-specific dynamics rather than an industry-wide ratio.

More importantly, the critical datapoint cuts against panic: only 1 to 3% of streams on Deezer go to AI-generated tracks, even though they are 44% of uploads. Listeners self-select toward human music, or at least toward music with established audiences and playlist placements, which AI slop rarely earns organically. If the market is already sorting real from synthetic without intervention, then the problem may be primarily one of server costs and catalog pollution rather than meaningful financial harm to artists.

There is also the commercial incentive question, because Deezer publishes alarming numbers about AI flooding while simultaneously selling detection technology to address it. Credit where it is due: the transparency is real, the data is verifiable, the methodology is consistent across quarters, and Deezer deserves genuine recognition for being the first and so far only major streaming service willing to publish numbers that make the entire industry look complicit by comparison. But "buy our fraud detection tool" is a motive worth noting when evaluating the urgency of the pitch. Music Ally, the industry publication, flagged this explicitly in its coverage: "There's nothing shady in that strategy, but it's relevant context."

What This Analysis Does Not Prove

Our $220 million to $720 million dilution estimate extrapolates from Deezer's disclosed ratios to an industry that declines to publish its own. Spotify and Apple Music may have different AI upload rates, different fraud rates, and different internal detection capabilities they choose not to disclose. CISAC's €4 billion risk projection uses unpublished methodology, and we cannot independently verify its inputs. Deezer's 85% fraud detection rate is self-reported and has not been independently audited. Ipsos tested recognition of AI music at the level of casual listening. Professional musicians and audio engineers may perform differently, and we have no data on that cohort. Suno and Udio are not the only generative music tools. Detection systems trained on their outputs may miss tracks from newer or less prominent generators. The data has gaps.

The Bottom Line

If you are a working musician whose income depends on streaming royalties, ask your distributor which platforms detect AI-generated content and which do not. Demand that Spotify, Apple Music, and Amazon Music publish quarterly AI upload disclosures equivalent to Deezer's. If one platform can tag 13.4 million AI tracks in a single year and demonetize 85% of their fraudulent streams, the technical capability exists. The question is whether other platforms face sufficient pressure to deploy it before the EU AI Act's labeling requirements take effect in August 2026.

If you manage a rights catalog or invest in music publishing, the CISAC €4 billion risk figure should inform your due diligence. Ask every platform that distributes your catalog for its AI detection rate, its fraud demonetization rate, and its upload-to-stream ratio for flagged content. If they will not answer, the Michael Smith case established that the fraud is real and that at least $8 million slipped through before anyone noticed. How many more Michael Smiths are currently running bot farms across platforms that refuse to publish detection rates, extraction volumes, or even acknowledge the synthetic composition of their upload streams, and how much royalty money has already been siphoned from working musicians who have no way to measure what they have lost? That is the question Deezer answered for its 9 million subscribers. The other 400 million paid streaming subscribers are still waiting.

Sources

  1. Deezer Newsroom (April 20, 2026). AI-generated tracks represent 44% of new uploaded music; 75,000 AI tracks per day; 85% of AI streams fraudulent; Ipsos survey of 9,000 respondents across 8 countries.
  2. Spotify Newsroom (January 28, 2026). Spotify paid $11 billion to the music industry in 2025, up 10% from 2024.
  3. U.S. Department of Justice (2026). Michael Smith guilty plea for AI-powered music streaming fraud exceeding $8 million.
  4. CISAC/PMP Strategy (2025). Joint study: 25% of creators' revenues at risk by 2028, potentially €4 billion.
  5. Music Ally (April 21, 2026). Analysis of Deezer's AI upload data, context on commercial incentives behind disclosure.
  6. Music Business Worldwide (2023). 100,000 tracks uploaded to Spotify and other services daily.
  7. Spotify Newsroom (September 25, 2025). Spotify strengthens AI protections for artists, songwriters, and producers.