Parents: don’t panic about Momo – worry about YouTube Kids instead.

I first heard about Momo when someone screenshotted a Facebook post about a creepy puppet that supposedly appeared in unsuspecting children’s phone messages and was spliced into YouTube videos, dispensing advice on self-harm and violent acts. I reacted with suspicion: this would hardly be the first time that something on Facebook turned out not to be true, and the Momo challenge seemed a bit too on the nose – too obviously sinister – to be real.

It turned out that Momo was indeed a hoax, a viral shock-story driven by a frightening image and well-intentioned worry about children’s safety online.

There have been videos on YouTube Kids with suicide advice spliced into otherwise innocuous cartoons as a malicious “joke” – they just don’t involve Momo. Parents have spotted them before; the American pediatrician Free Hess recorded and documented one on pedimom.com. And this is, lamentably, the tip of the iceberg when it comes to inappropriate content on the video platform, even on the version that’s supposedly curated for kids.

YouTube has been battling disturbing videos for years, but a 2017 Medium post by the writer and artist James Bridle brought the problem to widespread attention, kicking off a slew of stories about the various horrors that could be found through the YouTube Kids app. Frightening videos of Peppa Pig at the dentist or Mickey Mouse being tortured were appearing in searches. Weirdly sexualized videos of Disney princesses were easy to find. Supposedly “family-friendly” channels showed children wetting themselves, being injured or screaming in apparent terror – a father who ran one such “prank” channel allegedly lost custody of two of his children as a result.

YouTube has removed a lot of the worst videos that used to be rife on the platform, but new ones keep coming, finding fresh ways to slip past the algorithm. The most recent major scandal involves the discovery of a “soft” pedophile ring operating in YouTube comments, where men leave chilling comments on videos of children and exchange numbers to share further images, as reported by The Verge.

YouTube’s key failing here is that it relies on a “flagging” system to find and purge inappropriate content, which means someone has to actually see the video in question and report it before anything can be done. Pre-moderation, where videos don’t make it on to YouTube Kids until they’ve been watched in full by a human being, is realistically the only way to keep the platform safe from malicious pranksters. But YouTube has shown no appetite for this, instead emphasizing its “robust” content-reporting features in its responses to these continual controversies.

When you download the YouTube Kids app, it tells you as much in the introductory screens: “We work hard to offer a safer YouTube experience, but no automated system is perfect.” No shit. The truth is that YouTube was never intended to be a platform for children, and I have zero faith in its ability to adapt itself to that role.

Even on the less extreme end of things, YouTube can be a parenting minefield. When my teen stepson was a train-obsessed five-year-old who couldn’t even read yet, we once left him watching videos of trains pulling into stations on the iPad for a few minutes and returned to find him innocently watching a video of a train accident that had appeared in the recommendations. Nowadays, with him having long since graduated from kids’ YouTube to obnoxious gaming channels, we have regular dispiriting conversations about whichever of his favored YouTube celebrities has recently done something incredibly stupid, like dropping the N-word on a stream or telling someone in the comments to kill themselves. That’s not even to mention the “alt-right”, anti-social-justice personalities whom the algorithm regularly feeds to young male users watching Call of Duty compilations, or the dangerous flat-Earth and antisemitic content that the platform has recently been forced to address.

The majority of YouTube Kids content isn’t distressing or disturbing – but it is mostly brain-numbingly terrible. A vast number of the kid-friendly videos uploaded are straight-up garbage: cheap, algorithm-driven songs or nonsensical stories featuring 3D models or toys of popular characters such as Elsa, Spider-Man and Peppa Pig. They are designed purely to extract views – and thereby money – from common search terms, not to entertain or educate kids. Friends with young children regularly complain about the inane surprise-egg or toy-review videos that have become household obsessions. My toddler would watch cheap, repetitive, unbearably cheery nursery rhyme videos for an hour if I let him.

The easiest solution for parents of young children might be to purge YouTube from everything – phones, TVs, games consoles, iPads, the lot. This is the approach we’ve taken in our household, which inconveniently contains two video games journalists and, consequently, an absurd number of devices. You don’t need to be a tech Luddite to find YouTube Kids both irritating and vaguely worrying. There is no shortage of good children’s entertainment available on Netflix, through BBC iPlayer and catch-up TV, or through advert-free games designed for young players. And there’s zero chance they’ll come across any suicide tips there.

Keza MacDonald is video games editor at the Guardian.
