It starts innocently enough. You plug your child into YouTube Kids for a few hours of peace, picking a nice, harmless video like a Peppa Pig episode or a Spider-Man cartoon.
There’s no reason to believe that these familiar animations from our own childhood will weird out the little ones. After all, what was great back then should still be great now! That, and the fact that you’re on YouTube Kids, the kid-friendly version of YouTube, should be enough to convince you that your kids are safe from inappropriate material.
Ten minutes in, however, you notice Spidey doing some pretty sketchy stuff to another character. Definitely not kid-friendly, right?
The long and short of it: you stepped into the dark side of YouTube, where bad actors rule and the platform’s algorithms are painful enablers.
Image credit: SparkyDoodles via YouTube
YouTube may not be a willing party to these dark forces, but the fact remains that it hosts these kinds of channels, with vile videos hiding under seemingly innocuous thumbnails. And this same content is targeted at young children.
YouTube Kids does NOT offer a safer YouTube experience
Image credit: The Odyssey Online
There are different levels of bad content, all designed to attract kids, ranging from “weird” videos that are not quite suitable for a young audience to unmistakably traumatizing, emotionally scarring material that even adults should hesitate to watch. From knockoff animations of Peppa Pig wailing during her visit to the dentist (the original being far less horrifying) to Peppa outright burning down a house with someone inside, it’s the kind of stuff that should be moderated far more carefully.
There are other disturbing videos where costume-clad people engage in violent behavior or become fathers of pregnant Disney princesses. These channels argue that their content is meant for adults who love wearing costumes, but their videos look kid-themed and thus attract young audiences.
Let’s not forget channels run by parents who use their kids in the name of YouTube “entertainment,” clearly crossing the line from charming to exploitative for monetary and other gains. These parents insist their kids are okay with appearing on YouTube, but it’s hard to imagine anyone enjoying being puked on on a regular basis.
It’s painful to see these kinds of videos garnering millions of views. Imagine how many young minds have been exposed to things like kidnapping, bullying, and burning people alive, or even something less sinister, like mean pranks played for the sake of online humiliation.
In response to this growing culture on the platform, YouTube has updated its policies and shut down channels not adhering to them.
Image credit: Giphy
The problem, though, goes beyond creators making inappropriate videos. Abuse of the system happens because the algorithm itself allows it. Many contend this, and YouTube itself admits it is part of the problem.
There’s a hole in YouTube’s algorithm through which disturbing videos can slip. Until YouTube improves the system’s ability to detect offensive content, people will continue to manipulate it, using tags and titles that promote their videos to kids. We will continue to see Elsa in animated videos doing things we’d rather young children not see.
Finding and flagging these kinds of videos is easy when they have millions of views (which is cringe-worthy to think about if you equate that number with the number of young minds exposed), but it can be a needle-in-a-haystack situation for those with fewer views.
The Adpocalypse, a term coined by PewDiePie after the controversy that started with him early this year, goes on. This is where it hurts the most for YouTube creators whose sole income depends on the ads on their videos.
Because of YouTube’s inability to pin down videos exploiting kids, some advertisers have stopped advertising on the platform completely for fear of their ads appearing on such videos. Advertisers like Toyota, Johnson & Johnson, Verizon, and AT&T boycotted the popular video platform.
YouTube’s solution of weeding out bad actors by demonetizing their videos and requiring 10,000 lifetime views before a channel can generate revenue holds promise, but only as long as the machine correctly identifies videos with offensive content. As it is, YouTube’s algorithm has a long way to go before it truly becomes intelligent.
Image credit: Imgur
This means that advertisers will also continue to shy away from YouTube and creators will go on losing money from ads.
Considering that YouTube is hemorrhaging millions of dollars in ad revenue, we can believe that it is, indeed, trying very hard.
If you’re a parent, you’ll just have to keep an eye on what your children are watching. If you’re a creator, avoid keywords that could link your content to questionable videos on the platform.
Hopefully, YouTube fixes the system soon so that advertisers’ trust will return and it will be all business as usual. Until then, keep growing your channel and remember that this, too, shall pass.
Date: January 2, 2018 / Categories: YouTube, / Author: Chell