YouTube tightens rules on AI-only videos
Google aims to stop ‘AI slop’ from flooding YouTube

YouTube has begun curbing AI-generated content that lacks human input, aiming to protect content quality and ad revenue. Since July 15, creators have been required to disclose their use of AI and to provide genuine creative value in order to qualify for monetisation.
The platform’s clampdown aims to prevent a flood of low-quality videos, known as ‘AI slop’, from overwhelming its algorithm and lowering ad returns. Analysts say Google’s new stance reflects the need to balance AI leadership with platform integrity.
YouTube will still allow AI-assisted content, but insists that creators offer original contributions such as commentary, editing, or storytelling. Without such contributions, AI-only videos will no longer earn advertising revenue.
The move also addresses rising concerns over copyright, ownership, and algorithm overload, issues that could destabilise the platform’s delicate content ecosystem. Experts warn that unregulated AI use may harm creators who produce high-effort, original material.
Stakeholders say the changes will benefit creators focused on meaningful content while preserving advertiser trust and fair revenue sharing across millions of global partners. YouTube’s approach signals a shift towards responsible AI integration in media platforms.