YouTube decided that its top priority was to protect children when it disabled the ability for viewers to leave comments on videos featuring kids earlier this year — even though the video giant was aware the move would frustrate many content creators, according to CEO Susan Wojcicki.
In February, YouTube announced that it would shut off comments for videos with children that “could be at risk of attracting predatory behavior,” with the exception of a “small number” of creators with the resources to actively moderate their comments sections. That came in response to a scandal involving sexually coded comments left by child predators on certain YouTube videos, which led big advertisers including Disney, AT&T and Epic Games to suspend their advertising.
In a blog post for creators Tuesday, Wojcicki explained that YouTube strives to balance objectives of maintaining an open platform while enforcing community guidelines.
“We know how vital comments are to creators… I also know this change impacted so many creators who we know are innocent — from professional creators to young people or their parents who are posting videos,” she wrote. “But in the end, that was a trade-off we made because we feel protecting children on our platform should be the most important guiding principle.”
Wojcicki also wrote, “[G]iven the scale and impact of YouTube, there’s nothing more important than managing our role as a platform responsibly.”
YouTube has for years wrestled with the problem of objectionable and harmful content. In the wake of the terrorist attack in Christchurch, New Zealand, YouTube’s teams “immediately sprung into action to remove the violative content” and “chose to temporarily break some of our processes and features,” according to Wojcicki. That resulted in the suspension of some news and commentary videos that didn’t actually violate YouTube’s community guidelines, but “given the stakes, it was another trade-off that we felt was necessary,” she wrote.
According to a report last month by Bloomberg News, Wojcicki and other YouTube senior executives have ignored concerns of employees about the site’s recommendation engine serving up conspiracy-theory videos to users, allegedly because YouTube put a higher value on engagement and watch time. In a subsequent interview with the New York Times, Wojcicki insisted YouTube has not turned a blind eye to the issues and said the reality of managing the problems is complex. “It’s not like there is one lever we can pull and say, ‘Hey, let’s make all these changes,’ and everything would be solved,” she told the Times.
Wojcicki may cover similar talking points about “brand safety” and the platform’s pledge to be a responsible actor at YouTube’s Brandcast event for advertisers, to be held in New York City on Thursday (May 2).
Other points Wojcicki outlined in the April 30 post, part of a series of updates to the YouTube creator community:
- YouTube has set a goal of having at least 50% of trending videos come from creators, with the remainder coming from music and traditional media companies.
- YouTube will provide more detail and tools for its community guidelines and advertiser-friendly policies. Wojcicki said the site’s “Self Certification” pilot program, which has been rolled out to over 1,000 channels so far, lets creators self-report how their video complies with ad policies “and build up trust that our systems adjust to.” Regarding monetization, YouTube will continue to focus on improving the accuracy of its system for classifying ad-compliant content.
- YouTube plans to do more to crack down on harassment on the platform. “While criticism from fellow creators can be constructive, any threats or doxing crosses the line,” Wojcicki wrote. “Such behavior is already prohibited by our policies. But stay tuned as we will do more to discourage this from happening on the platform.”
- YouTube remains “very concerned” about Article 17 (formerly referred to as Article 13) of the European Union’s recently passed copyright directive, Wojcicki wrote, because it could impose serious limitations on what YouTube creators can upload. She noted that each EU member state now has two years to enact laws in accordance with the directive, and urged creators to continue to lobby lawmakers to oppose new regulations that would make internet platforms liable for uploads of copyright-protected content.
- In response to creator concerns that its “strike system” was inconsistent and confusing, YouTube updated its policies so that a channel’s first violation now draws a one-time warning, giving creators a chance to learn the rules before penalties apply. After that, YouTube will start enforcing penalties, and after three “strikes” a channel will be terminated.