YouTube Has Kid Troubles Because Kids Are a Core Audience

Casey Chin; Getty Images

YouTube has a child exploitation problem. In February, the platform disabled comments on millions of videos that include children 13 and younger after WIRED UK revealed that pedophiles had used the feature to identify clips depicting nude or sparsely clothed children. On Monday the company faced new criticism after researchers from Harvard’s Berkman Klein Center said YouTube’s algorithm recommends videos of young children to users with viewing histories consistent with the profile of a child predator.


But YouTube’s troubles with children, and the content created by and for them, extend far beyond these textbook instances of exploitation. In a way, YouTube’s problem is YouTube itself. Much of the platform rewards and amplifies exploitive actions by staking creators’ revenue and clout on a handful of metrics—such as view counts and ad impressions—that are easily gamed.

Among adults, this system contributes to what Data & Society researcher Rebecca Lewis calls an “alternative influence network” of inflammatory far-right YouTube creators, serves up polarizing recommendations that can inspire conspiracy theorists, and generates YouTube’s distinctly salacious genre of clickbait. This toxic brew is all the more dangerous for a more vulnerable (and difficult to measure) group of users: kids.

YouTube claims that its core product “has never been for kids under 13.” Yet the recent investigations, and other data, show just how central young children have become to the site’s profitability and popularity, as both creators and viewers.

After Monday’s Berkman Klein report, critics called on YouTube to stop recommending any videos featuring children under 13. YouTube balked, instead publishing a blog post touting past tweaks it has made in the name of child safety (e.g., minors can only livestream when accompanied by an adult; content that comes close to violating its community guidelines can be excluded from recommendations). YouTube also noted it removed more than 800,000 videos for potential child-safety violations in the first quarter of 2019, the majority of which were taken down “before they had 10 views.” YouTube did not respond to a request for comment before publication. After publication, YouTube said it has invested significantly in YouTube Kids and heavily markets the product to parents, to encourage them to direct children to YouTube Kids rather than the main service.

Here’s what those changes didn’t involve: YouTube’s revenue stream.

Children under 13 have emerged as one of the most lucrative demographics for creators, reflecting the rise of YouTube Kids—which was introduced in 2015 as a safe, semi-controlled space for kids to experience the site—the growing ubiquity of the YouTube babysitter, and recent rules limiting YouTube creators’ ability to easily monetize videos featuring “adult” content.

Videos for children accounted for 12 of the 20 most-viewed YouTube videos in April, according to Tubular, a social media analytics tool. The most popular kids’ video in April was “The Boo Boo Song | CoCoMelon Nursery Rhymes & Kids Songs,” which recorded a whopping 226,198,404 views, about 100 million fewer than the most popular video overall during the period, “Step Up: High Water,” a YouTube Premium show. The channel, CoCoMelon, likely earns between $638,000 and $10.2 million a month from ad revenue, according to estimates by analytics site SocialBlade; the wide range reflects variables in the prices advertisers could be paying.

Young children are also the face of many of YouTube’s most popular family vlogging channels, which boast tens of millions of followers. On channels like RyanToyReview, children spend hours unboxing presents and reviewing toys for an audience; on others, like Kids Diana Show, which has more than 28 million subscribers, they chronicle the minute details of their lives, from kindergarten graduations to playdates. SocialBlade estimates that Diana’s mother, who runs Kids Diana Show and stars in some videos alongside her daughter, could earn up to $4.2 million a month in ad revenue from the channel.

CoCoMelon and Kids Diana Show did not respond to requests for comment, which isn’t surprising. It’s often not clear who’s producing top kids’ videos. Last month The Wall Street Journal attempted to speak with the people behind the top 10 YouTube kids channels, but couldn’t confirm who ran nine of the accounts. While WIRED could find the names of some of the people associated with the businesses behind the accounts, we couldn’t confirm their identities either.

In CoCoMelon’s case, the business and its trademarks are owned by a company called Treasure Studio, but there’s little publicly available information hinting at who is behind the operation. The California business license was registered by an Irvine couple, whose names match an H-1B visa sponsorship request for an animator, but WIRED could not confirm they operated the YouTube channel. Multiple requests for comment at the phone numbers and email addresses associated with the couple and their businesses went unanswered.

Critics say these channels’ anonymity weakens accountability. Over the past three years, dozens of high-profile YouTube channels have been shuttered following allegations of child abuse. The parents behind DaddyOFive, a channel known for its so-called pranks, lost custody of the two kids featured in their popular videos and eventually pleaded guilty to child neglect in 2017.

A boom in kids’ videos featuring disturbing subject matter—like artificial insemination, drinking bleach, and other jarring imagery—has embroiled YouTube in scandal since 2016. The videos used classic children’s animation tropes and knock-offs of popular characters to slip past YouTube Kids’ content filter, racking up millions of views and ad dollars.

YouTube removed many of the most disturbing videos following public outcry, but the current state of kids’ YouTube is undeniably jarring. Many of the top videos share characteristics that point to an intense emphasis on engagement and, ultimately, revenue over quality. These videos are often so poorly and nonsensically animated that it is difficult to tell whether they are the result of some rudimentary auto-generation software or actual human handiwork. Plots are rare; instead, the content often merely consists of a list of objects, animals, or colors, ostensibly aimed at improving a child’s reading comprehension.

The words spoken in these videos rarely come from a human and instead appear to be low-quality audio files from text-to-speech software, giving the whole affair a rather dystopian vibe. Most of the other noises featured in these videos sound as if they were lifted straight from GarageBand’s sound effects library. They’re disjointed yet oddly familiar, imbuing many of the videos with an ASMR-y effect.

Somehow, this all culminates in a wealth of views anyway. The absence of a plot or any discernible entertainment value is irrelevant. Kids are watching regardless. Maybe it’s the jarring nature of it all that attracts them—or the bizarre titles, which are so SEO-optimized, presumably to game YouTube's recommendation algorithm, that they're more akin to word salad: “Learn fruits and Animals with funny Monkey style PC games | Educational Videos for Kids.” Whatever it is, it clearly works. And so long as some part of YouTube continues to reward this type of low-cost content with millions of easily monetizable views, these videos will continue to be made—and consumed by the site’s most impressionable viewers—en masse.

Updated, 6-7-19, 11:30am ET: This story has been updated to include a comment from YouTube received after the story was published.
