More than three years after launching the tyke-targeted YouTube Kids app — which has turned out not to be as clean and well-lit as YouTube initially touted — the video giant is going to introduce features to help parents handpick exactly what content their children are allowed to see.
In the next few months, the YouTube Kids app will add a “white-list” feature, something parents and guardians have been requesting for a while. With the “parent-approved content” setting, adults will be able to specifically select every single video and channel available in the app.
In addition, YouTube will roll out pre-screened content collections in collaboration with “trusted partners.” Those will be available starting this week, with initial collections from Sesame Workshop and PBS Kids; YouTube says it will add more partners over time.
The third new parental control coming to YouTube Kids is an option to limit search results to only channels that have been “verified by the YouTube Kids team.” That’s in addition to the app’s existing ability to let parents disable search entirely.
“While no system is perfect, we continue to fine-tune, rigorously test and improve our filters for this more open version of our app,” James Beser, product director for YouTube Kids, wrote in announcing the feature.
The move comes after reports last year that found some clearly out-of-bounds videos were available in YouTube Kids — including disturbing videos that use children’s characters in freakish, violent or sexual situations.
In response, YouTube adopted a new policy to be more proactive in age-restricting such inappropriate content. According to YouTube’s policy for age-restricted content, as part of determining if videos should be blocked from YouTube Kids, moderators will evaluate vulgar language, violence and disturbing imagery, nudity and sexually suggestive content, and the portrayal of harmful or dangerous activities.
Meanwhile, a coalition of consumer groups earlier this month filed a formal complaint with the FTC, alleging that YouTube illegally tracks data on kids who are under 13. YouTube says it disables accounts if it becomes aware that a user is under 13 and says it introduced YouTube Kids specifically for younger children. In addition, YouTube says it enforces policies that restrict advertisers from targeting personalized ads to children under 13 or from collecting personally identifiable information from children who are underage.
Last fall, Google said the YouTube Kids app has generated more than 70 billion views to date and has more than 11 million weekly active viewers.