Navigating the Online Safety Act: what businesses need to know
For businesses operating in the digital space, staying ahead of legislative changes is critical. The Online Safety Act 2023 (“the Act”) is a landmark piece of legislation that will transform how businesses manage online content and user safety. Designed to create a safer online environment, the Act imposes new responsibilities on businesses that operate online platforms, services and applications. In this blog, we break down the key elements of the Online Safety Act, explore its implications for digital businesses and provide guidance on how you can prepare for compliance.
Understanding the Online Safety Act
The UK Online Safety Act is a comprehensive piece of legislation aimed at regulating online content and ensuring the safety of users, particularly children. It puts a range of new duties on social media companies and search services, making them more responsible for their users’ safety on their platforms.
The core objectives of the Act are:
Protecting children from harmful content: platforms must ensure that children are protected from accessing harmful and age-inappropriate content, while also offering parents and children straightforward and accessible methods to report any issues that may occur online.
Promoting transparency: the Act will also safeguard adult users by requiring major platforms to be more transparent about the types of potentially harmful content they permit, while providing users with greater control over the content they choose to view.
Ensuring accountability: Ofcom has now been designated as the independent regulator for online safety. It will establish guidelines in codes of practice, outlining the steps providers can take to meet their safety obligations. Additionally, Ofcom will have extensive authority to evaluate and enforce compliance with the regulatory framework.
Who does the Act apply to?
The provisions of the Act apply to search services and services that allow users to post content online or to interact with each other (user-to-user services), where those services have links to the UK. This covers a wide range of websites, apps and other services, including social media services, consumer cloud storage and file-sharing sites, video-sharing platforms, online forums, dating services and online instant messaging services.
The Act also applies to services provided by companies based outside the UK if they have links to the UK - for example, where a service has a significant number of UK users or where the UK is a target market.
Key changes and requirements for businesses
The Act passed into law on 26 October 2023. Ofcom is leading work to implement the Act’s provisions and is taking a phased approach to bringing duties into effect. The Act requires Ofcom to develop guidance and codes of practice setting out how online platforms can meet their duties. Ofcom will have wide-ranging enforcement powers, including the power to issue fines of up to £18 million or 10% of qualifying worldwide revenue (whichever is greater).
The Online Safety Act introduces several new obligations that businesses must adhere to:
Proactive risk assessments - providers of certain online services will be under a duty to conduct a “suitable and sufficient” illegal content risk assessment of their service. The risk assessment must identify: (i) the risk of users encountering illegal content; (ii) the risk of the service being used for the commission or facilitation of a priority offence; (iii) the risk of the service’s functionalities facilitating the dissemination of illegal content or the commission or facilitation of a priority offence; (iv) the risk of harm to individuals arising from the above; and (v) how the design and operation of the service (including its governance, business model, use of proactive technology, measures to promote media literacy and safe use, and other systems and processes) may reduce or mitigate those risks.
Children’s access assessment - providers of certain online services will be under a duty to carry out a children’s access assessment (CAA), which must determine: (i) whether it is possible for children (those under 18 years old) in the UK to access all or part of the service; and (ii) if so, whether the service has a significant number of child users in the UK, or is likely to attract a significant number of child users in the UK.
Safety by design - online service providers that are governed by the Act must take or use proportionate measures relating to the design or operation of the service to prevent individuals from encountering illegal content, to mitigate and manage the risk of the service being used to commit or facilitate an offence, and to mitigate and manage the risk of harm to individuals.
Transparency reports - providers of certain (categorised) online services will be required to publish annual transparency reports, in response to notices from Ofcom, detailing how they are tackling harmful content on their platforms. These reports should include information such as the amount of harmful content removed, the actions taken to protect users, and the resources allocated to content moderation.
Content moderation - online service providers that are governed by the Act are required to have robust systems in place for moderating content, including a combination of automated tools and human moderators to detect and remove harmful material.
User empowerment - platforms must provide users with tools to manage their online experience, such as the ability to filter out harmful content or restrict who can interact with them.
Appeals and redress mechanisms - users must be provided with a clear process to appeal content removal or account suspensions, ensuring fair and transparent decision-making.
Preparing for compliance: practical steps for businesses
To ensure compliance with the Online Safety Act, businesses that will be governed by the Act should consider the following steps:
Conduct a comprehensive risk assessment: identify potential risks associated with your online services and develop a strategy to mitigate these risks.
Review and update content moderation policies: ensure your content moderation policies are in line with the requirements of the Act and that your systems can respond swiftly to harmful content.
Implement age assurance and safety controls: introduce effective age verification or age estimation measures and provide tools for parents to control their children's online experiences.
Enhance transparency and user communication: make your content policies clear to users and provide them with easy access to reporting and appeal mechanisms.
Stay informed and engage with Ofcom: keep up to date with guidance from Ofcom and participate in consultations to ensure your business is prepared for any further changes.
Conclusion
The Online Safety Act marks a new era in online safety regulation, with significant implications for businesses across the digital landscape. By understanding the requirements and taking proactive steps to ensure compliance, businesses can not only avoid penalties but also foster a safer, more trustworthy online environment for their users.
As the provisions of the Act are implemented, it’s essential for businesses to stay informed, invest in necessary changes and approach compliance as an opportunity to enhance their operations and user experience.
Tags: online safety, online safety act.
Heather Stark