The U.K. has proposed an online harms bill to govern tech firms that allow users to post their own content or interact with one another. Digital and Culture Secretary Oliver Dowden presented the proposals to parliament on Tuesday.

Social media sites, websites, apps and other services that host user-generated content or allow people to talk to others online will need to remove, and limit the spread of, illegal content such as child sexual abuse material, terrorist material and content encouraging suicide. The government is also assessing whether the promotion of self-harm should be made illegal.

U.K. media regulator Ofcom is confirmed as the body that will have the power to fine companies failing in their duty of care up to £18 million ($24 million) or 10% of annual global turnover, whichever is higher. It will also have the power to block non-compliant services from being accessed in the U.K., and the government will reserve the power to hold senior managers liable.

The new regulations will apply to any company in the world that hosts user-generated content accessible to people in the U.K., or that enables them to interact with others online, whether privately or publicly. They will also apply to search engines, because of the role they play in enabling access to harmful content online, as well as to gaming sites, forums, messaging apps and commercial pornography sites.

“The legislation will include safeguards for freedom of expression and pluralism online — protecting people’s rights to participate in society and engage in robust debate,” the proposals state. “The laws will not affect the articles and comments sections on news websites.”

The proposals will be formally tabled before parliament in 2021 and could become law by 2022.

“I’m unashamedly pro-tech but that can’t mean a tech free-for-all,” Dowden said. “Today Britain is setting the global standard for safety online with the most comprehensive approach yet to online regulation. We are entering a new age of accountability for tech to protect children and vulnerable users, to restore trust in this industry, and to enshrine in law safeguards for free speech.”

“We will not allow child sexual abuse, terrorist material and other harmful content to fester on online platforms,” Home Secretary Priti Patel said. “Tech companies must put public safety first or face the consequences.”

“We’re really pleased to take on this new role, which will build on our experience as a media regulator,” Melanie Dawes, Ofcom’s chief executive, added. “Being online brings huge benefits, but four in five people have concerns about it. That shows the need for sensible, balanced rules that protect users from serious harm, but also recognise the great things about online, including free expression. We’re gearing up for the task by acquiring new technology and data skills, and we’ll work with Parliament as it finalises the plans.”

In response, Facebook’s head of U.K. public policy, Rebecca Stimson, said: “We already have strict policies against harmful content on our platforms, but regulations are needed so that private companies aren’t making so many important decisions alone.”

“Over the last few years we’ve invested billions in safety, tripled the size of our safety team to 35,000 and built artificial intelligence technology to proactively find and remove harmful content. While we know we have more to do, our industry-leading transparency reports show we are removing more harmful content before anyone reports it to us,” Stimson added.