8 million people currently use YouTube Kids every week, and many kids also have access to the main YouTube platform, whether at school, at home, or at friends' houses.
Over the past week, my Facebook feed has been filled with post after post from parents shocked by the disturbing things coming to light about YouTube and YouTube Kids: content targeted at young kids that promotes such R-rated material as self-harm, sexual abuse, and drug use.
What many people don't know is that this has been happening on YouTube for years.
In fact, it has been going on so long that YouTube resolved in 2017 to demonetize channels which "made inappropriate use of family friendly characters."
(As for the lesser-known controversy, which only my blogger friends seem to be talking about – pedophiles commenting on videos to alert other perverts to "wardrobe malfunctions" or other moments they may find interesting – YouTube has also been dealing with that problem, and with videos that may be intentionally created for this purpose, since 2017 as well.)
The first time I became aware of issues on YouTube was back in 2014, when a friend called me, absolutely freaked out.
Her daughter had been watching a cute "Count with Elsa" video on YouTube while my friend did a bit of cleaning. Suddenly, her daughter started screaming at the top of her lungs as Elsa morphed into a zombie. My friend just wanted to get the image away from her daughter and exited out of the screen – not knowing to report it.
After the incident, her daughter was terrified of all things Frozen. Seeing Elsa on a cracker box in the grocery store was enough to set her off and bring back the nightmares.
At the time, I would sometimes use YouTube to play music for my daycare kids ("Barbara Ann" and "Let It Go" were on heavy rotation), but after the incident, I started seeing YouTube in another light.
YouTube Is Like Facebook – Not Netflix
Many parents do not realize that, just as there is no "approval department" at Facebook vetting each and every post, YouTube allows users to create and post content at their own discretion.
Even YouTube Kids, which many parents view as safer, is not screened in advance. Creators who upload to YouTube Kids can be removed if they are reported for content that violates the YT Kids terms – but that process is subject to the same flaws in YouTube's reporting structure that I discuss below.
YouTube is not a television network. It does not create the (majority of) content on its platform, and it is not subject to the broadcasting regulations that govern the likes of the Disney Channel.
As an adult who uses the platform, and as a content creator, I appreciate the amount of content and the unique voices available to me. But I'm also aware that just as YouTube allows amazing, talented, giving people to flourish, it also provides a platform for people who promote the worst aspects of our humanity.
For years, there have been accounts showing child abuse, racism, dead bodies, and other graphic content – and some of the most popular channels with the most dedicated audiences have featured these things. The dark side of YouTube is not new.
Why Isn't YouTube Content Screened?
YouTube simply doesn't have the manpower to screen every single video uploaded to its platform – and, let's be honest, doing so would take away one of the platform's benefits: the sheer variety of content from a wide range of cultures, viewpoints, and people.
300 hours of video are uploaded to YouTube every minute – that's 18,000 minutes of footage arriving every single minute. Screening all of it before it posted would require 18,000 employees watching around the clock just to keep pace.
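For the skeptics, here is a quick back-of-the-envelope check of that staffing number, assuming one reviewer watches footage at normal speed (one minute of video takes one minute to review):

```python
# 300 hours of new video arrive on the platform every minute.
hours_uploaded_per_minute = 300

# Convert to minutes of footage arriving per minute of real time.
minutes_uploaded_per_minute = hours_uploaded_per_minute * 60

# A reviewer watching at normal speed clears 1 minute of footage per
# minute, so the backlog only holds steady with one reviewer per
# minute of incoming footage.
reviewers_needed = minutes_uploaded_per_minute

print(reviewers_needed)  # -> 18000
```

And that's 18,000 people watching simultaneously, around the clock – before breaks, shifts, or any time spent actually making a judgment call.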
Now, I'm not here to make excuses for YouTube, and to be honest, I'd be surprised if YouTube came out of this controversy unscathed, because it has marketed YouTube Kids as aiming "to make it safer and simpler for kids to explore the world through online video."
Quite simply, YouTube Kids has failed at that mission, and the company may end up being held accountable in court.
What Happens When a Video Is Reported
YouTube's official stance is that flagged videos are reviewed and dealt with. However, I know as a content creator that multiple flags or complaints are often needed before that happens, and that the offense usually needs to be pretty blatant for YouTube to take down a video.
Not only is this an issue of manpower, this is also, unfortunately, an issue of bullying.
Many YouTubers have young audiences and have no qualms about sending them out to attack detractors or competitors. If videos were temporarily removed based on flagging alone, the unfortunate truth is that these bullies would abuse the system – which could have devastating effects on people who rely on their YouTube content to provide for their families.
(As a single mom whose family relies on her blogging income to pay ALL the bills, I can tell you if someone had the ability to shut down my blog and turn off my income on a whim, that would be devastating. This quick post isn’t the forum for discussing whether vlogging/blogging is a valid job for the 21st century, but I can tell you that the majority of bloggers I know are incredibly hardworking entrepreneurs.)
And even if we decided that removal-on-flagging were the better trade-off, the manpower issue means it isn't likely to happen.
What is Happening Now
After the major uproar from parents, which has led major ad buyers to pull their ads, YouTube is scrambling.
Recently, those of us whose children appear on our channels in any capacity have had YouTube remove monetization from our videos, and many videos no longer allow comments. This has also affected young YouTubers who were using their channels to save for college or pay for extracurriculars.
While we can hope that YouTube will put more safety measures in place, the fact is that it has been trying to do so since 2017, and we are still seeing these horrendous videos in 2019. This is not going to be solved overnight, or even within a few months.
My guess is that this will be an ongoing issue, with these creeps finding loopholes in YouTube's safety measures just as quickly as YouTube can implement them. A safe social media platform for young children is, quite frankly, a pipe dream.
What Parents Can Do Instead
We all need a few minutes – believe me, I get it, and I am not going to be the SanctiMommy who preaches about the dangers of screen time (although they are real…). But even if you go directly to the official Peppa Pig YouTube channel, make a custom playlist, or select the strongest security settings on YT Kids, there is always the risk of a non-official show getting queued up by YouTube's algorithms, or of your child accidentally tapping the wrong button.
Even if you are sitting right beside your child, one of these shows can still come on and scare your child for a few seconds before you are able to turn it off.
The only safe option is to keep young children off of YouTube.
I understand that for a lot of families, YouTube is the choice because it is free. If you can't work a Netflix subscription or DVDs of your child's favorite shows into your monthly budget, many libraries lend those same DVDs. (Many also don't charge late fees on items checked out on children's cards.)
For older children, check out the channels they are watching (watch a few videos yourself), have them use YouTube where you can keep an eye on things, and have conversations with them about what they may encounter – and encourage them to come talk to you about it.
We simply cannot trust that a social media network which relies on user-generated content will be able to keep it PG.
Hopefully, this controversy serves as a much-needed wake-up call to parents who trusted a social media network, or the decency of strangers.
When your child has an e-mail account, there will be bots and trolls out there harvesting addresses any way they can so they can send inappropriate e-mails.
When your child is old enough to start a social media profile, no amount of filters, privacy settings, or parental controls will be enough to stop a dedicated pervert.
And that's not to mention their online interactions with other kids being raised in the Wild West of the online world, where catfishing, cyberbullying, and trolling are just parts of life.
The only thing that can keep our children safe on the internet is involved parents and guardians.