The Online Safety Bill (the “Bill”) is currently progressing through Parliament and is expected to become law later this year. The Bill will prohibit providers of user-to-user services (social media companies) from hosting illegal or harmful content on their platforms. Its purpose is to make the internet safer for all users, especially children. The regime will be overseen and enforced by Ofcom.
The Bill will impose a legal duty on providers to:
- enforce age limits and age-checking measures
- ensure risks and dangers to children’s safety are more transparent
- prevent children from accessing harmful and age-inappropriate content
- provide parents and children with clear and accessible ways to report online problems
- remove illegal or harmful content quickly or prevent it from appearing in the first place.
As it stands, social media platforms have age requirements for users, but these are rarely enforced. The Bill attempts to address this by requiring social media companies to use age verification technologies.
The proposed legislation in its current form will apply extraterritorially, meaning that it will also cover social media platforms based outside the UK.
All organisations in scope will need to assess whether their platforms and services are likely to be accessed by children, and will need to remove illegal content, including:
- child sexual abuse
- controlling or coercive behaviour
- extreme sexual violence
- hate crime
- inciting violence
- illegal immigration and people smuggling
- promoting or facilitating suicide
- promoting self-harm
- revenge porn
- selling illegal drugs or weapons
- sexual exploitation
This content will have to be removed as quickly as possible. Failure to do so could lead to fines of up to £18 million or 10 percent of a company’s annual global turnover, whichever is higher. Further, company executives risk prosecution or imprisonment if they fail to comply with Ofcom’s requests.
TikTok has recently been fined £12.7 million for illegally processing children’s data. The Bill goes further than previous legislation: TikTok and other social media firms will be responsible for ensuring that no underage users have access to their platforms. The Information Commissioner stated that TikTok had done “very little, if anything” to check whether underage users were on the platform and had failed to remove them. Under the new legislation, social media platforms will have to do considerably more to protect children.
There is, of course, content that is not illegal but is still harmful to children. Under the Bill, platform providers will also need to prevent children from accessing such content, including:
- pornographic content
- online abuse, cyberbullying, or online harassment
- content that does not meet a criminal level, but which promotes or glorifies suicide, self-harm or eating disorders.
What does this mean for schools?
The onus does not shift solely to user-to-user providers. Schools should continue to keep safeguarding at the forefront of their minds and to act in line with Keeping Children Safe in Education (KCSIE). Once the Bill is passed, it is anticipated that KCSIE’s online safety section will be expanded, or that separate guidance will be published.
We recommend that schools continue to embed online safety within their policies, provide training so that staff are aware of the risks, and cover online safety in the curriculum, including how children can stay safe online and protect their data.
We will keep schools updated once the Bill becomes law and as any further guidance is published.