The regulations would create a statutory "duty of care" for social media companies such as Facebook and Twitter to protect young people who use their sites. The rules would be overseen by an independent regulator funded by a levy on internet companies.
"No one in the world has done this before, and it's important that we get it right," Media Secretary Jeremy Wright told the BBC. "And I make no apologies for the fact that we will put forward proposals here, which we believe are the right way to approach this, but we will then listen to what people have to say about them."
A 12-week consultation will now take place before the draft bill is published. While the United States has largely relied on market forces to regulate content in a country where free speech is revered, governments in Europe have signaled they are willing to take on the tech companies to block harmful content and prevent extremists from using the internet to fan the flames of hatred.
Britain's Home Secretary, Sajid Javid, criticized tech firms for failing to act despite repeated calls for action against harmful content. "That is why we are forcing these firms to clean up their act once and for all," Javid said.
Facebook's U.K. head of public policy, Rebecca Stimson, said the goal of the new rules should be to protect society while also supporting innovation and freedom of speech. "These are complex issues to get right and we look forward to working with the government and Parliament to ensure new regulations are effective," she said.
Wright insisted the regulator would be expected to balance freedom of speech against the need to prevent harm. "What we're talking about here is user-generated content, what people put online, and companies that facilitate access to that kind of material," he said. "So this is not about journalism. This is about an unregulated space that we need to control better to keep people safer."