YouTube is inundated with AI-generated slop, and that’s not going to change anytime soon. Instead of cutting down on the total number of slop channels, the platform is planning to update its policies to cut out some of the worst offenders making money off “spam.” At the same time, it’s still going full steam ahead on tools that will keep your feeds full of mass-produced brainrot.
In an update to its support page posted last week, YouTube said it will modify guidelines for its Partner Program, which lets some creators with enough views make money off their videos. The video platform said it already requires YouTubers to create “original” and “authentic” content, but now it will “better identify mass-produced and repetitious content.” The changes take effect on July 15. The company didn’t say whether the change is related to AI, but the timing is hard to ignore given how many people have noticed the rampant proliferation of slop flowing onto the platform every day.
The AI “revolution” has resulted in a landslide of trash content that has swamped most creative platforms. Alphabet-owned YouTube has been especially bad recently, with multiple channels dedicated exclusively to pumping legions of fake and often misleading videos into the sludge-filled sewer that users’ YouTube feeds have become. AI slop has grown so prolific it has infected most social media platforms, including Facebook and Instagram. Last month, John Oliver on “Last Week Tonight” specifically highlighted several YouTube channels crafting obviously fake stories designed to show White House Press Secretary Karoline Leavitt in a good light. These channels, and similar accounts across social media, pump out quick AI-generated videos to make a fast buck off YouTube’s Partner Program.
Gizmodo reached out to YouTube to see if it could clarify what it considers “mass-produced” and “repetitious.” In an email statement, YouTube said this wasn’t a “new policy” but a “minor update” meant to help it better confront content that already violates the platform’s rules, the kind of mass-produced content it calls “spam.”
Not exactly.. to clarify, this is a minor update to our long-standing YPP policies to help us *better identify* when content is mass-produced or repetitive. This type of content has already been ineligible for monetization for years, and is content viewers often consider spam
— TeamYouTube (@TeamYouTube) July 3, 2025
However, under the new guidelines, content that uses AI-generated voiceovers “without any personal commentary or storytelling” may be ineligible for making a quick buck. The same goes for any “slideshow compilations” with “reused clips,” “reaction or recap-style content with little original insight,” or anything that follows “highly repetitive formats, especially in Shorts.”
YouTube Shorts is still the premier place for most of these AI slop channels. In June, YouTube CEO Neal Mohan championed a new tool for generating Shorts “from scratch.” Mohan said the tool could essentially generate both the video and the audio for a Short, which is especially ironic since the AI models behind it, including Google’s Veo 3, were trained on YouTubers’ content without their express permission.

It remains unclear what content falls under this idea of “highly repetitive formats.” Would a series of fake Harry Potter vlogs so annoying you’d want to shove the preteen back into his cupboard under the stairs count as “repetitive” content? It’s all vague enough that we can imagine many of these slop creators slipping through the cracks. Content moderation is imperfect by nature, but the way today’s grifters are able to monetize slop, even when individual videos don’t get much traction, epitomizes the holes in Google’s open-handed approach to AI. A growing number of accounts offer get-rich-quick advice on uploading AI-generated videos assembly-line style, something that would seemingly violate YouTube’s own policies.
Even if slop channels put an ounce more effort into making each video seem less like “spam,” the quality is certainly going to remain subpar. Google and YouTube want to push AI as king, but the inevitable result will be a worse platform for everybody. As the name suggests, slop slides downhill, and creators and viewers will be the ones swimming up to their eyes in the muck.