The UK’s Online Harms Bill: Potential Implications for the Right to Privacy

Richard Wingfield
Published in The GNI Blog
5 min read · Sep 23, 2020


Following a GNI-led, multistakeholder roundtable discussion with local experts on the U.K. Online Harms White Paper, Richard Wingfield of Global Partners Digital explores the often-overlooked privacy implications of the current regulatory approach.

Those following digital policy will be well aware that governments around the world are introducing or considering new regulation of online platforms and content. One of the more ambitious sets of proposals came from the UK, whose Online Harms White Paper was published last year. The proposals, in essence, were for online intermediaries to be subject to a new legal “duty of care” to prevent certain forms of illegal (and possibly legal but harmful) content and activity on their services.

The UK government is expected to publish its full response to the Online Harms White Paper consultation before the end of the year (an interim response having been published in April), and to introduce legislation in 2021. While much of the focus from human rights defenders has been on the potential risks to freedom of expression, the proposals could also have significant implications for the right to privacy, which I want to explore in this blog post.

Unlike regulation in many countries which only targets publicly available content, the proposals in the Online Harms White Paper extended to private communication services as well, on the grounds that “users should be protected from harmful content or behaviour wherever it occurs online, and criminals should not be able to exploit the online space to conduct illegal activity.” As such, the new duty of care potentially applies as much to private, encrypted messaging platforms, such as WhatsApp or Signal, as it does to publicly facing social media platforms and search engines.

The government has been clear from the outset that there will be different expectations for private communication services, but it has been less clear on precisely what the duty of care will require from companies when it comes to the private services they facilitate. While the White Paper itself said that there will be no requirements under the legislation for companies “to scan or monitor content” on private communication services, two key questions remain. First, what will be considered a “private communication service,” and second, what requirements, if any, will be imposed instead?

On the first question, assuming the regulation outlines differing expectations for public and private communication services, more rigid definitions of “private communication services” could pose greater risks to privacy. If, for example, a service was no longer considered “private” once a particular number of people were communicating in a group, then even an encrypted group chat could potentially be considered as “public,” triggering an obligation to scan or monitor content for illegality. Furthermore, implementing this obligation would likely require removing or undermining end-to-end encryption, fundamentally weakening the privacy of those participating. Many of the responses to the consultation argued that any encrypted channel should be considered “private” and that arbitrary numbers should not be used to determine whether a channel is “private” or “public.”

Given the broad spectrum of approaches the new legislation could take, the second question — what specific requirements would apply to private communication services — is also key. A light-touch approach, for example one in which the duty of care for private communication services was limited to ensuring that users can report content which is illegal or in breach of the companies’ terms of service, would not cause too much concern (many platforms already offer this functionality, although it does raise questions about the privacy of the others in the channel). If, however, companies were required to add functionality by which law enforcement agencies could be invisibly added to private communications (along the lines of GCHQ’s so-called “ghost proposal”), or even to monitor or share the metadata of private communication services to identify risks of harm, then substantive concerns from a privacy perspective emerge.

A further concern would stem from any requirement for companies to report potentially illegal content to law enforcement and other government agencies. The White Paper was clear that the legislation would set out expectations on companies to cooperate with law enforcement agencies. While it would be one thing for this to mean retaining copies of content where needed for investigations or prosecutions of criminal offences, any expectation that companies should proactively send potentially illegal content, along with associated details about users, would only exacerbate the privacy concerns highlighted above.

In response to these concerns, it is often noted that a significant amount of illegal and harmful activity takes place on private communication services — from the sharing of child sexual abuse imagery and the plotting of terrorist attacks to children being bullied or even groomed — and that those who create these channels should not be able to evade accountability for the harms they facilitate. But it is simply not possible to restrict the privacy of only the “bad actors” on these channels. By undermining the privacy and security of these channels for everyone who uses them, the government would also put at risk human rights defenders, journalists, minorities, and many others who rely on them to communicate freely, without fear of the harassment, violence, or other consequences that they would otherwise face.

The importance of encryption and other privacy protections when we communicate online is well recognised under the international human rights framework, with reports from UN Special Rapporteurs and Resolutions of the UN Human Rights Council emphasising the important role these protections play in ensuring all human rights can be enjoyed online. While private communication services certainly raise challenges for governments, the answer cannot be to undermine the privacy upon which we all rely — some for their very physical safety and security. The UK government should take a strong and principled position when it publishes its response, ensuring that the proposals protect not only freedom of expression, but the right to privacy as well.

Richard Wingfield is the Head of Legal at Global Partners Digital

Head of Legal at Global Partners Digital. Posts are written in a professional capacity. @rich_wing on Twitter.