Kids love videos—the sillier the better. And it’s a rare parent who hasn’t used them to secure a little quiet time. Today YouTube is, by far, the largest source of videos of all kinds. When YouTube launched an app for children in 2015, many parents assumed the content would be carefully curated and reliably child-friendly.
Much of it is. YouTubeKids lets even young children happily swipe through a vast collection of content, much of it featuring familiar characters like Winnie the Pooh, Peppa Pig and PAW Patrol. Educational clips are also plentiful, many from reputable sources like Khan Academy and PBS Kids. Mixed into this stew are videos created by users, which vary enormously in content and quality. A small percentage includes bizarre and even traumatizing images, sometimes of those same beloved characters doing lewd and violent things.
How does this happen? Google uses artificial intelligence to decide whether a video is suitable for children. Although AI has come a long way, it doesn’t always spot problems that would be glaringly obvious to people. It may, for example, miss the nuance that distinguishes adult satire from the innocent content it’s meant to mock. And it’s often oblivious to trolls and clickbait—content created simply to lure clicks that generate revenue.
In its defense, Google warns that children may encounter inappropriate content and asks that parents flag such material so other kids won’t see it. Of course, that’s a significant change. In the past, parents could assume children’s media was created with the wellbeing of kids in mind. On YouTubeKids, at least some of the videos are created to satisfy algorithms, stringing together content associated with key words in ways that are at best nonsensical and at worst disturbing. Google keeps changing its policies in an effort to stay ahead of so-called bad actors, but often it seems the robots and their handlers are playing catch-up.
Even when content is properly curated, parents need to be aware that children see a lot of commercial messages on YouTubeKids. A YouTube Red subscription may be free of paid advertising, but children still have access to entire channels created by companies like Hasbro or McDonald’s. They’re also likely to encounter unboxing videos, short segments in which someone breathlessly unwraps a toy or a sweet, a process that seems designed to incite cravings in kids.
Unfortunately, the parental controls for YouTubeKids are very limited. Parents can’t set their own filters for content or create playlists of acceptable videos such as those reviewed by Common Sense Media (commonsensemedia.org/youtube-reviews). Most kids will still explore by swiping, so it’s good to know about these options:
Change the password. Find the “Grown-Ups Only” section in the YouTubeKids app, and unlock it by entering the random four-digit passcode. The numbers are spelled out so pre-readers can’t use the code. If your child is at the edge of literacy, tap the “Set My Own Passcode” button and choose a code of your own.
Disable search. Searching for videos increases the likelihood that children will see something unsuitable. Google allows parents to set up a profile for each child, so search can be enabled or disabled, depending on the child’s age and self-control. Off should be the default. Tap the lock icon in the lower right, enter the password, choose settings, create or find your child’s profile, and toggle off search.
Review history. Because YouTubeKids doesn’t have filters, parents can’t necessarily keep kids from seeing something they don’t want them to see. The app does make it easy to review history, which at least allows a conversation, after the fact, about why a video is objectionable.
Block videos you don’t want your child to see. If you see something unsuitable for your child, block the video or the entire channel. Just tap the triple-dot button for the video and then tap “Block.”
Report videos no child should see. Reporting gets the attention of human screeners who are actually counting on conscientious parents to let them know about unsuitable content that slipped by the robots. Think of this as a public service. If you see something, say something by tapping the triple-dot button and then “Report.”
Set limits. To its credit, YouTubeKids does include a timer. Once it’s set, a colored progress bar lets your child see how much time is left in a session. When the clock runs out, a “Time’s Up” animation appears and the app locks until a parent enters the access code.
Consider other options. YouTubeKids may have the largest collection of videos but, when it comes to kids, quality is preferable to quantity. Companies like Disney, Nick Jr. and PBS Kids have brands to protect, so they are likely to be more careful about what appears in their apps. For other possibilities, consult the list of video alternatives compiled by Common Sense Media (www.commonsensemedia.org/lists/streaming-video-apps).
Regardless of where your child watches videos, talk often about what your child is seeing and ask questions that develop critical thinking skills. Why does your child like certain characters? Are they behaving in a way that would be OK if a real person did it? Why is something funny? Did your child learn anything from the video? Is someone trying to get them to buy or do something? Having these conversations helps children become more discerning about what they watch, a skill that will only become more valuable as they get older.
Carolyn Jabs, M.A., has been writing about families and technology for over twenty years. She is also the author of Cooperative Wisdom: Bringing People Together When Things Fall Apart, a book that describes a highly effective way to address conflict in families, schools and communities. Available at Amazon and cooperativewisdom.org.
© Copyright 2018, Carolyn Jabs. All rights reserved.