The Joint Committee responsible for reviewing the controversial Online Safety Bill has today (14 December 2021) published its recommendations.

The Online Safety Bill is concerned with protecting users of social media platforms and search engines from various types of harm, mainly harm emanating from other users.

The Joint Committee has undertaken the gargantuan task of listening to the concerns raised by a wide cross section of society, considering how the Online Safety Bill can be improved, and making specific recommendations as to how the Bill should be amended. It has achieved all this within a few short months, which is all the more impressive when you remember that its starting point, the current draft Online Safety Bill, is frankly a bit of a hodgepodge.

The Joint Committee has recommended extensive, important, pragmatic and detailed changes to the Bill that should make it fit for purpose, provided those recommendations are taken forward.

The Joint Committee has also sought to make the Online Safety Bill more ambitious in some ways, for example by recommending that very harmful paid advertisements come within the scope of the Bill and that Ofcom be given even greater powers. It has also made recommendations to improve the democratic credentials of the new regime, diluting the Secretary of State's (and the government of the day's) power to amend the rules on a whim. The Committee has also underscored the importance of protecting the freedom of the press. However, I very much hope that the views of ordinary citizens will not be eclipsed by journalists and news outlets that want to push their agendas. The press continues to play an important role in our democratic society, but so do ordinary people!

More commentary will follow, but in the meantime, the Joint Committee's main recommendations can be summarised as follows:

  • Ofcom should draw up mandatory Codes of Practice for internet service providers, for example a Code of Practice on risk areas such as child exploitation and terrorism. Ofcom should also be able to introduce additional Codes as new features or problem areas arise, so that the legislation does not become outdated as technology develops.
  • Ofcom should require service providers to conduct internal risk assessments recording reasonably foreseeable threats to user safety, including the potential harmful impact of algorithms, not just content.
  • The new regulatory regime must contain robust protections for freedom of expression, including an automatic exemption for recognised news publishers, and acknowledge that journalism and public interest speech are fundamental to democracy. [Note: while I agree with this principle, I do not want a two-tier system in which ordinary people are given significantly weaker rights than journalists, including the right to respond or provide a counter-argument via social media to articles published by journalists!]
  • Paid-for advertising should be covered by the Bill, with scams and fraudulent advertisements being of particular concern.
  • Service providers should be required to create an Online Safety Policy for users to agree to, similar to their terms and conditions of service.
  • Big tech companies must face sanctions if they fail to comply with the Online Safety Act, once it is passed, and with Ofcom as the UK regulator.
  • Ofcom's powers to investigate, audit and fine the companies found in breach of the new rules should be increased.

The Committee also believes the Bill should be clearer about what specifically is illegal online, and that it should not be up to the tech companies to determine this. The Committee therefore agrees with the Law Commission’s recommendations to add new criminal offences to the Bill. It recommends that:

  • Cyberflashing be made illegal.
  • Deliberately sending flashing images to people with photosensitive epilepsy, with the intention of inducing a seizure, be made illegal (known as Zach’s law).
  • Pornography sites be placed under legal duties to keep children off them, regardless of whether they host user-to-user content.
  • Content or activity promoting self-harm be made illegal, as it already is for suicide.

Further, the report recommends that individual users should be able to complain to an ombudsman when platforms fail to comply with the new law. It also recommends that a senior manager at board level, or reporting to the board, be designated the "Safety Controller". In that role, they would be liable for a new offence: failing to comply with their obligations as a regulated service provider where there is clear evidence of repeated and systemic failings that result in a significant risk of serious harm to users.

Side note

Another committee of MPs is still examining the same Online Safety Bill and will report its own findings and recommendations later. I only hope that doesn't muddy the waters if they end up suggesting a different approach to the definition of 'online harms', for example.