The guidance, published on Wednesday, requires platforms to set clear rules for uploading content, provide easy reporting and complaint processes, and restrict access to adult sites. The rules are effective immediately.
“Uploading content relating to terrorism, child sexual abuse material or racism is a criminal offence,” the guidance states. “Platforms should have clear, visible terms and conditions which prohibit this – and enforce them effectively.”
Research commissioned by Ofcom shows that a third of users say they have witnessed or experienced hateful content. A quarter say they have been exposed to violent or disturbing content, and one in five to videos or content that encouraged racism.
Platforms established in the U.K. are required by law to take measures to protect under-18s from potentially harmful video content, and to protect all users from videos likely to incite violence or hatred, as well as from certain types of criminal content.
Ofcom is empowered to investigate and take action against platforms found to be in breach of the guidelines. This could include fines, requiring the provider to take specific action or, in the most serious cases, suspending or restricting the service. The regulator is currently in discussions with platforms about their responsibilities and how they are complying with them.
However, unlike Ofcom’s broadcast remit, the regulator says its role “is not to assess individual videos.” “The massive volume of online content means it is impossible to prevent every instance of harm,” the regulator states. Ofcom was given its new role earlier this year when the government published a draft of the Online Safety Bill. The bill is currently undergoing pre-legislative scrutiny and will be formally introduced in parliament later this year.
The new regulations allow Ofcom to impose fines of up to £250,000 ($340,000) or 5% of the platform’s “qualifying revenue” – whichever is greater.
“Online videos play a huge role in our lives now, particularly for children. But many people see hateful, violent or inappropriate material while using them,” said Ofcom chief executive Melanie Dawes. “The platforms where these videos are shared now have a legal duty to take steps to protect their users. So we’re stepping up our oversight of these tech companies, while also gearing up for the task of tackling a much wider range of online harms in the future.”