My YouTube Channel’s Comment Section Is Infested by Bots!

Over the past few months, my YouTube channel has been dealing with a growing problem: comment section bots. Not the kind that make life easier—these are fake accounts flooding the comment sections with spam, scams, and shady crypto coins. If you’ve browsed my recent videos, you’ve probably seen them. It’s reached the point where I wanted to dig deeper and understand what exactly is going on and why these bots are putting in the effort. You can see more in my latest video.

There are two main types I’ve been seeing. First are the so-called “porn bots”—fake profiles usually fronted by photos of attractive women leaving strangely flattering comments. These messages are almost always generic, excessively positive, and often include strings of emojis. The bots generally post within a few seconds of a video’s publication, so they’re the first comments viewers see.

Out of curiosity (and with some caution), I decided to follow one of the porn bot accounts using a cloud-based virtual machine. Sure enough, it led me down a chain of sketchy YouTube profiles and eventually landed me on fake dating sites filled with adult content and phishing attempts. They asked for email addresses, personal info, and even Google account access. It became very clear very quickly that the goal is to harvest data and monetize the clicks.

The fact that these comments keep showing up tells me it must be working on some level. Even if only a fraction of a percent of viewers engage, the scammers are probably earning enough to justify the effort. That’s the part that’s hard to ignore—how much of this content is flooding into YouTube, and how many people might be falling for it.

The second type I’m dealing with comes a day or two after a video gains traction: crypto scam floods. These are typically pump-and-dump schemes, trying to generate hype around worthless coins before the inevitable rug pull.

What makes the crypto spam more frustrating is that it often drowns out real engagement. Dozens of comments flood in all at once from different accounts, pushing legitimate conversation out of sight. And if I’m not home and only have access to my phone, deleting them is a slow, tedious process—three taps per comment. I just can’t keep up.

YouTube does offer some tools to help—keyword filters, blocked link settings, and a “strict” comment moderation option. I had mine on the default moderation setting before, and switching to “strict” made a noticeable dent in the crypto spam. It didn’t do much for the porn bots, though. Their messaging is too unpredictable for keyword lists or regular expressions to catch, which is likely how YouTube’s moderation currently works.
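To make that limitation concrete, here’s a minimal sketch of what keyword-style filtering amounts to. The patterns below are my own illustrative examples, not YouTube’s actual list—the point is that fixed patterns catch repetitive crypto spam but miss bots whose wording constantly shifts.

```python
import re

# Illustrative blocklist: terms of the kind I see in the crypto spam.
# (These are my own examples, not any real platform's filter.)
BLOCKED_PATTERNS = [
    r"pump",
    r"rug\s*pull",
    r"\bcoin\b",
    r"100x",
    r"https?://\S+",  # block bare links outright
]

blocked_re = re.compile("|".join(BLOCKED_PATTERNS), re.IGNORECASE)

def is_spam(comment: str) -> bool:
    """Return True if the comment matches any blocked pattern."""
    return bool(blocked_re.search(comment))

# A crypto pitch trips the filter; a vague flattering comment sails through.
print(is_spam("This coin is going 100x, don't miss out!"))  # True
print(is_spam("Wow, you are so talented!!"))                # False
```

That last line is exactly the porn-bot problem: nothing in the message is a reliable fixed keyword, so a regex-based filter has nothing to grab onto.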

Out of curiosity, I also tried an experiment: I gave four sample comments (including a known bot) to ChatGPT, Gemini, and Grok to see if AI could spot the fake. All three nailed it. They picked up on the generic language, unnatural emoji use, and odd usernames. It showed me that the tech exists to filter this stuff out—it’s just not practical for YouTube to run large language models against every single comment given the scale of commenting on the platform.
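Interestingly, the signals the models described are simple enough to sketch as a toy heuristic. This scorer (the phrase list, emoji threshold, and username pattern are my own guesses, not any model’s internals) flags the same three tells: generic language, heavy emoji use, and odd usernames.

```python
import re

# Illustrative stock phrases of the kind porn bots reuse.
GENERIC_PHRASES = [
    "amazing content", "love your videos", "so inspiring",
    "keep it up", "you deserve more subscribers",
]

def bot_signals(username: str, comment: str) -> list[str]:
    """Collect the surface signals the chat models keyed on."""
    signals = []
    if any(phrase in comment.lower() for phrase in GENERIC_PHRASES):
        signals.append("generic language")
    # Rough emoji check: count characters outside the Basic Multilingual
    # Plane, where most emoji live.
    if sum(1 for ch in comment if ord(ch) > 0xFFFF) >= 3:
        signals.append("heavy emoji use")
    # Odd username: a long trailing digit run, e.g. "Jessica48301".
    if re.search(r"\d{4,}$", username):
        signals.append("odd username")
    return signals
```

A bot-like comment lights up all three signals, while a substantive comment from a normal account returns none. The real lesson of the experiment stands, though: an LLM catches these accounts without any hand-written rules, which is exactly what a heuristic like this can’t do once the bots change their script.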

I’ll keep doing what I can—blocking links, filtering known keywords, and manually flagging and deleting spam when I catch it. If you see these bots in the comments, the best thing you can do is ignore them. Don’t reply, don’t engage, and definitely don’t click on the links. I’ll take care of the cleanup as soon as I’m able.

Until YouTube finds a scalable way to tackle this, the infestation isn’t going anywhere. But at least now we have a better understanding of what’s behind it—and why staying clear is your best option.