After several years of delay, the Government published its long-awaited online harms regulatory regime proposals on Tuesday 15th December. One result of those delays is a system that does not quite ‘lead the world’, as was the original intention (the EU published its Digital Services Act on the same day, for example). However, the proposals still represent a significant change in the scale of regulation for the largest social media platforms and in the means by which the rules are drawn up.


This has been driven by a cross-party and global consensus that change is needed to rein in the worst online activity. Ministers have stated that the ‘era of self-regulation’ for large tech platforms is over. However, the delays were caused, among other things, by concerns about the impact on freedom of speech of platforms exercising control over content that was legal but controversial. This was reinforced by extensive advocacy from the news media sector, which Oliver Dowden, the Secretary of State, recognised yesterday. Under the Government’s proposals, news media organisations have been ruled out of scope of the regulations to address just such concerns, and platforms will now have to provide new channels for users to challenge the takedown of their content. To paraphrase Mr Dowden, this is not about preventing adults from accessing content they disagree with.


The Government will require the largest social media platforms to justify the takedown of content deemed harmful but legal, while all tech companies in scope must address the very worst, illegal, content. The result is a proposed system which requires businesses to make a risk assessment of the threats on their platform, working with the regulator to agree an appropriate set of measures to mitigate and address those harms. Size, scale and risk will determine the extent of the activity to which each business is committed.




As with any complex area of legislation, there will be a number of (as yet) undefined areas. In addition, the system will need flexibility to react to changes in technology and human behaviour in the years to come. A case in point is the new regulator, Ofcom: when it was first established seventeen years ago, who could have foreseen its future role as a social media regulator?


In light of this, there are a number of areas that warrant further attention, including:


  • First, the harms will be identified, and codes of conduct focusing on the systems and processes for dealing with them will be drafted by the regulator. Questions remain: how will these codes be developed, and what flexibility will be afforded to Ofcom? Will interpretations and definitions of the harms change over time, and how will stakeholders react to this? The Government has said there will be parliamentary oversight but is keen to ensure Ofcom’s independence.


  • Second, the most significant platforms will be required to provide transparency reports on their moderation activity and risk identification. There will be considerable interest in what is considered appropriate and how this is presented in the public domain; stakeholders will want to understand what yardstick will be used. Yesterday in Parliament we heard comments about the industry ‘marking its own homework’, which demonstrates how some view the system.


  • Third, platforms will be split into two categories. Only the higher tier, with the greatest reach and impact, will be required to publish transparency reports and take action on legal but harmful content accessed by adults (all will have to protect children and address illegal content). The activity required is intended to be proportionate to the size of each platform and the level of risk it identifies. One of the big questions will be how this proportionate approach is determined. The Government has proposed a process based on thresholds for a variety of factors, advised on by Ofcom. The jury will be out on how effectively the system is explained and communicated to the wider world.




Of course, there will be many more questions than those posed here, and the next forum for them to be aired is pre-legislative scrutiny. Ministers have said a draft bill is likely to be introduced into Parliament in the new year, with full legislation to follow. This means that the full regulatory system is still some way from implementation, although Ofcom can now begin preparatory work following confirmation of its role.


The coming months (and years), as Parliament examines these significant proposals, will undoubtedly see politicians and stakeholders debate the flexibilities and definitions noted above. For example, we are likely to see a desire from some quarters to secure a more prescriptive approach, perhaps mandating particular mitigations for the identified harms, or expanding the list of issues addressed.


Taken together, these new measures have the potential to be far-reaching and significant in their impact. However, the flexibilities mean parliamentary scrutiny is likely to be intense, with many calling for clarity on the standards to which platforms will be held.