In February 2015, YouTube launched a standalone video app for children under 13 called YouTube Kids. The app lets the little ones enjoy kid-friendly videos such as songs, cartoon shows, and dance videos (Ben-Yair, 2015). A YouTube platform that specifically caters to a younger audience is more than just a separate app; it is a call for online content to be age-appropriate. Over the following years, YouTube became the catalyst for several changes that reflect how social media can be a safe place for children.
Nowadays, you won’t find many children playing on the streets with rocks or sticks. They usually begin their day with a screen in front of their eyes. A study conducted by knowthenet.org.uk revealed that approximately 59% of children have already used a social network by the time they are 10 (Graber, 2014). Either their parents give them access to these sites, or they manage to open their own accounts. In fact, 13 is the minimum age to sign up on Facebook, Twitter, Instagram, Pinterest, Tumblr, Kik, and Snapchat.
YouTube requires account holders to be 18, although a 13-year-old can open an account with a parent’s permission. In other words, minors are present in the online space. With this in mind, one cannot help but wonder: do their social media experiences reflect the child in them?
A social media user is also a content creator. Whatever you upload to the Internet can end up in another person’s feed. Lurking out there is a pool of content that varies in theme, subject, and message, and more often than not, some posts are not meant for children to see.
Much of this sensitive material is beyond a young mind’s comprehension. A simple post containing suggestive violence or hate can have adverse psychological effects, because children cannot yet fully process what they see. Their cognitive capacity to make smart decisions is still developing between the ages of 12 and 18 (Gruenewald, 2017). It is normal for a child to observe and model other people’s behaviors. But what if they copy a wrong example they saw online?
Meanwhile, picture this: a child signs up for an account on Facebook. He types in his name, address, contact number, and email address (or his parents’). The mere fact that his personal information is registered on the Internet is a red flag that he may be at risk. Any information about the child can be misused, a phenomenon called Child Identity Theft (CIT).
Having an account as a minor exposes information to potential abusers who can take advantage of a child’s name and open accounts or credit in it. A child’s personal information is like bait on a hook: fraudsters grab hold the moment the hook dips into the water. Again, the child won’t detect such criminal acts, since they know so little about the world. Fortunately, there is something we can do about this.
Apart from the Children’s Online Privacy Protection Act (COPPA), a law designed to protect the online personal information of children under 13, we have technological advances to mitigate the risks.
With children becoming more avid users of social media, YouTube has stepped in to use technology as a blanket of safety for them. They have provided users with the tools to contribute to making the online world an age-appropriate and kid-friendly space.
First, content creators can use their privacy settings to control both access and age-appropriateness. A video’s visibility can be set to private (viewable only by people the creator invites), unlisted (viewable by anyone with the link), or public. For the age-restriction tool, all they have to do is mark the video as “made for kids” or “not made for kids” before uploading it. In either case, content creators are expected to adhere to YouTube’s Community Guidelines.
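For the technically curious, the creator-side settings above map onto fields in the public YouTube Data API v3: a `videos.insert` request carries a `status` part with `privacyStatus` and a `selfDeclaredMadeForKids` flag. The sketch below only assembles that request body; the `build_upload_metadata` helper is a hypothetical name for illustration, not part of the API itself.

```python
# Sketch: assemble the metadata a creator's upload tool might send to the
# YouTube Data API v3 (videos.insert). Field names follow the public API;
# the helper function itself is an assumption for illustration.

def build_upload_metadata(title, privacy_status, made_for_kids):
    """Build the 'snippet' and 'status' parts of a videos.insert request."""
    if privacy_status not in ("private", "unlisted", "public"):
        raise ValueError(f"unknown privacy status: {privacy_status}")
    return {
        "snippet": {"title": title},
        "status": {
            # Controls who can view the video: private, unlisted, or public.
            "privacyStatus": privacy_status,
            # The creator's own "made for kids" declaration (the COPPA-related
            # checkbox described above).
            "selfDeclaredMadeForKids": made_for_kids,
        },
    }

meta = build_upload_metadata("Nursery Rhymes Medley", "public", made_for_kids=True)
print(meta["status"])
```

In a real upload the dictionary would be passed to an authenticated API client along with the video file; the point here is simply that visibility and the kids declaration are explicit, creator-set fields.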
The guidelines help determine whether a video should carry an age restriction. To watch age-restricted videos, users must be signed in, and their account must indicate they are 18 or older. Viewers who don’t meet these requirements see a warning prompt and are redirected to search for age-appropriate content.
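The viewer-side gate described above boils down to two checks. Here is a minimal sketch of that logic; the function name and signature are hypothetical, not YouTube’s actual implementation.

```python
# Minimal sketch of the sign-in + age gate for age-restricted videos:
# the video is served only when the viewer is signed in AND at least 18.
from typing import Optional

def can_view_restricted(signed_in: bool, age: Optional[int]) -> bool:
    """Return True only for signed-in viewers aged 18 or older."""
    return signed_in and age is not None and age >= 18

print(can_view_restricted(signed_in=True, age=21))   # signed-in adult: allowed
print(can_view_restricted(signed_in=False, age=30))  # not signed in: blocked
print(can_view_restricted(signed_in=True, age=15))   # underage: blocked
```

Anonymous viewers fail the check even if they are adults, which mirrors the article’s point that the restriction is tied to the account, not just the person behind the screen.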
YouTube has also assembled a Trust & Safety team that reviews content for an extra layer of protection. The team inspects videos and applies age restrictions where needed, again ensuring that viewers are signed in and at least 18 years old. No matter where a restricted video is opened, only the appropriate audience can view it.
At the core of these policies and age-restriction tools lies an online experience parents and kids don’t have to worry about: easy access to videos that bring smiles to their faces and giggles on the sofa, without running into inappropriate content.
In the near future, YouTube envisions a more meticulous approach to evaluating content: automated systems that apply age restrictions based on its established policies. The company expects little to no impact on the revenue of content creators in the YouTube Partner Program, since videos that violate its advertiser-friendly guidelines already run limited or no ads.
YouTube is responding to a call many companies should follow: age-appropriateness in modern times. Everyone should help build a safe and secure space for the younger generations who will someday be online creators themselves. Starting today, we can develop a sense of responsibility in what we put on the Internet. Let’s not leave children with marks that cannot heal. Let’s not open any opportunity for their information to be misused. There are factors we can control, so let us control them. YouTube is urging us to evaluate the content we put out and take the necessary actions to protect children.
Date: February 17, 2021 / Categories: Analytics, YouTube / Author: Joy P