3 March 2020

Policing the internet; online harms

Regulatory framework

In 2002 I wrote a paper discussing whether the internet could be policed using existing remedies in tort. Fast-forward 18 years, and the Government now proposes to establish in law a new duty of care owed to internet users.

Other nations are working on new laws and rules to deal with online harms, but the UK will be the first country to set these out in a full regulatory framework.

Companies should be held accountable

According to the White Paper published in 2019, companies will be held to account for tackling a comprehensive set of online harms, ranging from illegal activity and content to behaviours which are harmful but not necessarily illegal. The proposals stem from growing concern about the internet’s role in facilitating harmful activity.

The duty of care will require companies to take responsibility for harm caused by content or activity on their sites, and all companies that fall within the regulatory framework will need to be able to demonstrate that they are meeting that duty.

The Government is minded to appoint Ofcom as the regulator for online harms. Whether the appointed regulator will have the power to fine companies that breach their duty of care obligations, and to hold senior managers personally responsible for failing to meet that duty, remains to be seen.

Current thinking is that the regulator will set out in codes of practice how a company can meet its duties and obligations. The detail of those codes will, in turn, be directed by the Government, which will tell the regulator what they should contain.

A number of stakeholders will need to be consulted on the codes of practice, including the police where the harm in question is illegal (for example, incitement to violence).

The regulator will have the power to require transparency reports from companies, setting out how much harmful material appears on their sites and what they are doing to deal with it. The regulator will publish these reports online so that users have the information they need to decide how they use the internet. It will also be able to require further information, such as how a company’s algorithms choose what content people see, and to make sure that companies report both on harms they already know about and on new harms as they emerge.

The regulator will also press companies to give researchers easier access to the information they need to assess how well those companies are keeping people safe.

Who will come under regulation?

Regulation will apply to companies that allow users to share or search for content created by other users, or to talk or communicate with each other online. This will include:

  • Social media platforms
  • File hosting sites
  • Public discussion forums
  • Messaging services
  • Search engines

Ofcom’s role

However, the Bill currently progressing through Parliament simply grants Ofcom, the media watchdog, the power to prepare and publish a report. The Bill also provides that Ofcom’s report must contain recommendations for the introduction of an Online Harms Reduction Regulator, and for the duty of care to be imposed on internet companies to prevent a variety of harms, from terrorism to electoral fraud to any other harm that Ofcom deems appropriate.

The scope of the duty of care to be imposed on internet companies is an area in which I predict lively debate. If the duty is as wide as the Bill suggests, internet companies really would be obliged to police their users in order to discharge it. The question is how they would go about doing that. Would they need to be given more powers, or be exempted from, say, the GDPR, if they are to “police the internet”?

No doubt there will be significant debate on this subject on social media platforms themselves, but it is the House of Lords and the House of Commons that need to debate these issues first. Let’s see where we are in another 18 years.

How can we help?

The Bill is broadly focused on two new sets of requirements: the first around illegal content, with a particular focus on terrorist and child sexual abuse material, and the second around minimising the distribution of harmful content, with a focus on self-harm and suicide. We can help you get ahead of the curve by reviewing your content to ensure that it does not fall foul of the pending regulations, and by helping you to mitigate the risk of prosecution.

For advice or more information, please contact Kamal Chauhan at kchauhan@hcrlaw.com or on 0121 726 7460.
