Increased attention on issues such as hate speech, online harassment, misinformation and the amplification of terrorist content has prompted policymakers around the globe to adopt stricter regulations that place greater responsibility on platforms. Policymakers today not only expect platforms to detect and remove illegal content, but are increasingly calling on them to take down legal but undesirable or “harmful” content as well.
The Electronic Commerce (EC Directive) Regulations 2002 established the current liability regime in the UK. That regime imposes only reactive liability on platforms: they are required to remove illegal content once they become aware of it.
In March 2022 the Online Safety Bill was laid before Parliament, aiming to protect children and tackle illegal and harmful content online. The Bill seeks to impose a statutory duty of care on platforms to protect their users not only from illegal content but also from content that is harmful, with harm defined as physical or psychological harm to individuals. It comprises obligations on platform operators to minimise the presence of illegal content (to be defined in future regulations) and a requirement to take down any illegal content on becoming aware of it. To meet the duty of care, platforms will need to put in place reporting systems and processes to improve user safety and protect against harm.
The Bill recognises two categories of platform services. Category one services, which cover the most popular sites with the largest user bases, such as Twitter and Facebook, will have greater obligations imposed on them than category two services. Category one services will need to offer all adult users the option of verifying their identity and allow adult users to block anyone who has not verified their identity.
The Bill also creates positive reporting obligations on platforms. For example, user-to-user services and search services will be under an obligation to report child sexual exploitation and abuse content detected on their platforms to the National Crime Agency.
Ofcom will be granted the power to impose fines of up to the higher of £18 million or 10% of the platform’s annual global revenue. Ofcom is also expected to be granted the power to impose criminal sanctions on directors and senior managers for more serious breaches of duty, such as where a platform fails to implement effective regulatory systems.
The Online Safety Bill is likely to receive Royal Assent during 2023.
Other countries around the globe are also legislating on the issue of liability for online platform providers, with some taking a stricter approach and imposing greater liability than others.
On 5 July 2022 the European Parliament approved the final draft text of the EU Digital Services Act, which looks set to transform the EU framework for regulating online content. Whilst it largely preserves the liability principles established in the EC Directive, it goes much further, introducing new requirements for platforms and empowering regulators to investigate non-compliance and take enforcement action against it.
Austria and Germany have taken a hard stance on the issue. In Austria, a new law introduced in 2021 obliges all online platforms that either have an annual turnover of 500,000 euros or more, or have at least 100,000 users, to remove “content whose illegality is already evident to a legal layperson” within 24 hours, and any other type of illegal content within 7 days, after the platform has been notified of its existence. Failure to remove such content can lead to heavy fines, in some cases of up to 10 million euros. The law also imposes additional obligations, such as the release of annual or quarterly reports.
Platform providers in Germany are obliged to introduce a “notice-and-action” system that offers users an easy-to-use mechanism for flagging potentially illegal content. There are also extensive transparency reporting obligations covering content removals. The German regulations oblige platform providers to remove or disable access to manifestly illegal content within 24 hours of being notified of it.
Overall, a clear trend is emerging of legislators holding online platforms increasingly to account for their users’ content. A particular pattern is the obligation on platforms to take down manifestly illegal content within 24 hours of notification, and other illegal content that has been flagged within 7 days.
Whilst the UK legislation has not yet been enacted, online platform providers should be planning with the future regulations in mind by: (i) ensuring that they have the right terms and conditions in place with users, specifying how they deal with not only illegal content but also harmful content; and (ii) building appropriate reporting systems and processes to identify not only illegal content but also potential online harms, mitigate the risk of harm occurring, and respond appropriately once harm is identified.
Please contact Paula Dumbill on 0116 323 0540 for more information on this subject, or to ask a question.
The information on this site about legal matters is provided as a general guide only. Although we try to ensure that all of the information on this site is accurate and up to date, this cannot be guaranteed. The information on this site should not be relied upon or construed as constituting legal advice and Howes Percival LLP disclaims liability in relation to its use. You should seek appropriate legal advice before taking or refraining from taking any action.