The Digital Services Act aims to better protect users in the online world, but its requirements will impose many new obligations on service providers, say Mary Loughney, Shane O’Neill and Filipa Sequeira
The increased use of digital technology dramatically raises the chances of end users being exposed to illegal or harmful content online. Regulation is catching up with the fast-paced world of emerging digital services and online platforms to ensure their security, accountability and openness.
The Digital Services Act (DSA), an EU regulation, aims to modernise the digital landscape and defend users’ rights.
What digital services does the DSA cover?
The DSA encompasses a broad range of online intermediaries, including internet service providers, cloud services, messaging platforms, marketplaces and social networks.
Hosting services have specific due diligence obligations. They include online platforms (hosting services that “store and disseminate information to the public, unless that activity is a minor or purely secondary feature of another service”), such as social networks, content-sharing platforms, online marketplaces and travel/accommodation platforms.
The DSA’s most significant regulations target very large online platforms with a substantial societal and economic impact: those reaching at least 45 million EU users, or 10 percent of the EU population.
Similarly, very large online search engines reaching more than 10 percent of the EU’s 450 million consumers will bear greater responsibility for combating illegal content on the internet.
Key provisions of the DSA
The DSA outlines specific responsibilities for online intermediaries, hosting service providers and online platforms, large and small.
Because of their significant societal impact, the largest services are placed in two categories, Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), which are subject to stricter regulations and audit requirements.
An independent audit must cover all the obligations imposed on VLOPs and VLOSEs by the DSA, including the duties to remove illegal content, provide users with transparency about how their data is used and prevent the spread of disinformation.
The following focus areas are central to the DSA’s requirements:
- Due diligence around safety and content moderation: The DSA lays out guidelines to address illegal content, such as hate speech, terrorist propaganda and fake goods. Online platforms must set up efficient content moderation systems and offer ways for users to report unlawful content. This may involve using automated tools for detection and removal.
- User rights and transparency about terms of service, consent, algorithms and advertising practices: Companies must offer more transparency about how their platforms operate, including their terms of service, algorithms and advertising practices. This will help users to understand how their data is being used.
- Users’ ability to control their privacy settings and flag harmful content: Companies must provide users with tools to manage their privacy settings and flag harmful content. This will help users to protect their personal data and keep themselves safe online. Companies are also required to respond to flagged content within a reasonable timeframe (a sketch of one such flagging workflow appears after this list).
- Measures to prevent the spread of disinformation: Companies must take steps to prevent the spread of disinformation, such as by labelling sponsored content and providing users with access to reliable information. This may involve working with fact-checking organisations or other companies to share information about disinformation.
- Accountability for the content hosted on platforms: Companies must be accountable for the content hosted on their platforms. This means they must be able to remove illegal content promptly and co-operate with law enforcement authorities.
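To make the reporting obligation more concrete, the following is a minimal sketch, in Python, of a “notice and action” workflow of the kind the DSA envisages: users flag content, each flag carries a review deadline, and every decision is recorded with a statement of reasons for the affected user. The class names, fields and 24-hour deadline are illustrative assumptions, not figures taken from the Act.

```python
# A minimal sketch of a notice-and-action workflow. All names and the
# 24-hour deadline are illustrative assumptions, not DSA-mandated values.
from dataclasses import dataclass, field
from datetime import datetime, timedelta

REVIEW_DEADLINE = timedelta(hours=24)  # assumed internal SLA

@dataclass
class Flag:
    content_id: str
    reporter_id: str
    reason: str  # e.g. "hate speech", "terrorist propaganda", "fake goods"
    submitted_at: datetime = field(default_factory=datetime.utcnow)

    @property
    def due_by(self) -> datetime:
        # Every flag carries a deadline so responses stay timely.
        return self.submitted_at + REVIEW_DEADLINE

@dataclass
class Decision:
    flag: Flag
    action: str                # "remove", "restrict" or "keep"
    statement_of_reasons: str  # transparency: shown to the affected user
    decided_at: datetime = field(default_factory=datetime.utcnow)

def triage(queue: list[Flag]) -> list[Flag]:
    """Handle the flags closest to breaching their deadline first."""
    return sorted(queue, key=lambda f: f.due_by)
```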
With these provisions in mind, a sensible first step is to conduct a maturity assessment using a risk-based approach, so that the organisation is aware of the risks that require mitigation (a sketch of the resulting risk register appears after this list):
- Maturity assessment: The risk assessment should consider a range of factors, such as the nature of the platform, the type of content hosted and the potential for harm to users.
- Address DSA requirement gaps: Based on the risk assessment, organisations should identify where they are exposed and implement the necessary measures, such as enhancing content moderation tooling, increasing transparency and enabling more robust end-user controls.
- Compliance reporting: Organisations will be required to undergo independent third-party audits. While the audit evaluates the platform’s systems and processes, compliance reporting may also cover the organisation’s overall risk mitigation efforts.
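As an illustration of what such an assessment might produce, here is a minimal sketch of a risk register in Python. The risk areas and the 1-to-5 likelihood and impact scales are illustrative assumptions rather than anything prescribed by the DSA.

```python
# A minimal sketch of a DSA risk register. Areas and scales are
# illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Risk:
    area: str        # e.g. type of content hosted, potential harm to users
    likelihood: int  # 1 (rare) to 5 (almost certain)
    impact: int      # 1 (negligible) to 5 (severe harm to users)

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

register = [
    Risk("content moderation tooling", likelihood=4, impact=5),
    Risk("transparency of ad practices", likelihood=3, impact=4),
    Risk("end-user privacy controls", likelihood=2, impact=4),
]

# Address the largest exposures first.
for risk in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"{risk.area}: score {risk.score}")
```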
The challenging aspects of the DSA’s audit requirements
To ensure compliance with the DSA’s provisions, digital service providers, predominantly VLOPs and VLOSEs, will be subject to independent audits. The audit must be conducted in accordance with the methodology and templates established in the delegated regulation and should review whether the VLOP or VLOSE:
- has a clear and transparent policy on how it addresses illegal content;
- has a system in place for detecting and removing illegal content and preventing the spread of disinformation; and
- provides users with adequate transparency about how their data is used.
The audits will evaluate the platform’s efforts to deal with illegal content, the openness of content moderation procedures, adherence to DSA requirements, and the efficiency of user reporting mechanisms. The platform’s practices for data security and privacy will also be examined.
It will be challenging for online intermediaries to comply with some DSA requirements.
Accurate classification of digital services
The DSA distinguishes between different types of digital services, such as intermediaries, hosting services and online platforms. Assigning the correct classification to a specific service can be complex, especially for hybrid platforms with multiple functionalities. Accurately defining the obligations and responsibilities associated with each classification requires careful analysis.
Removing illegal content in a timely manner
The DSA requires the removal of unlawful content in a timely manner after being made aware of its existence. Implementing effective content moderation mechanisms while respecting freedom of expression and avoiding over-removal or under-removal of content is a complex task. Developing sophisticated algorithms and human review processes to strike the right balance poses significant technical and operational challenges.
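One common way to strike that balance is a tiered pipeline: an automated classifier scores each item, clear-cut cases are actioned automatically and the uncertain middle band is routed to human reviewers. The sketch below illustrates the idea in Python; the thresholds are assumptions that would, in practice, be tuned to limit both over-removal and under-removal.

```python
# A minimal sketch of a tiered moderation pipeline. The thresholds are
# illustrative assumptions, not values prescribed by the DSA.
AUTO_REMOVE_THRESHOLD = 0.95   # assumed: near-certain violations
HUMAN_REVIEW_THRESHOLD = 0.60  # assumed: uncertain cases need a person

def route(content_id: str, violation_score: float) -> str:
    """Route content based on a model's estimated probability
    that it is illegal."""
    if violation_score >= AUTO_REMOVE_THRESHOLD:
        return f"{content_id}: removed automatically, user notified"
    if violation_score >= HUMAN_REVIEW_THRESHOLD:
        return f"{content_id}: queued for human review"
    return f"{content_id}: no action"

print(route("post-123", 0.97))  # removed automatically
print(route("post-456", 0.72))  # human review
print(route("post-789", 0.10))  # no action
```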
Further transparency about how content is moderated
The DSA requires more transparency about how online intermediaries moderate content. This includes providing information about the criteria used to moderate content, the processes used to make decisions and the appeals process available to users who flag moderation issues.
Disclosing sensitive information about their internal operations will, however, be difficult for many online intermediaries.
Additional steps to protect users’ privacy rights
The DSA requires additional steps to protect users’ privacy and enhance users’ rights. This includes transparency, user control over content and redress mechanisms.
These new provisions can be challenging to implement as they require online intermediaries to change their business practices significantly.
Implementing user-friendly interfaces and effective complaint-resolution mechanisms to ensure seamless user experiences can be technically complex and resource intensive.
Compliance with new rules on targeted advertising
The DSA introduces new rules on targeted advertising. These rules prohibit online intermediaries from using sensitive personal data to target users with ads and require them to give users more control over the ads they see.
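As an illustration, the sketch below strips special categories of personal data before an ad-targeting profile is built and excludes minors from targeted advertising altogether, which the DSA also requires where the provider knows the user is a minor. The attribute names and profile shape are illustrative assumptions.

```python
# A minimal sketch of sensitive-data filtering for ad targeting.
# Attribute names and the profile shape are illustrative assumptions.
SENSITIVE_ATTRIBUTES = {
    "health", "religion", "political_opinion",
    "sexual_orientation", "ethnicity", "trade_union_membership",
}

def build_targeting_profile(user: dict) -> dict | None:
    """Return an ad-targeting profile with sensitive attributes
    removed, or None if the user may not be targeted at all."""
    if user.get("is_minor", False):
        return None  # no profiling-based ads aimed at minors
    return {k: v for k, v in user.items() if k not in SENSITIVE_ATTRIBUTES}

profile = build_targeting_profile(
    {"is_minor": False, "interests": ["cycling"], "health": "asthma"}
)
print(profile)  # {'is_minor': False, 'interests': ['cycling']}
```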
Co-operation with authorities
The DSA emphasises co-operation between platforms and regulatory authorities.
Ensuring information sharing, responding to legitimate requests and establishing effective communication channels with various national authorities across the EU pose many challenges. Maintaining confidentiality and data protection while complying with these requirements can be tricky.
Interpretation of the DSA
The interpretation of the DSA may evolve as the legislative process continues. Even so, several themes suggest how one might expect an audit to be conducted:
- Transparency: The audits must be conducted transparently.
- Accountability: The audits are designed to ensure that VLOPs and VLOSEs are accountable for compliance with the DSA.
- Effectiveness: The audits must effectively identify and address any compliance gaps.
- Proportionality: The audits must be proportionate to the size and complexity of the VLOPs and VLOSEs.
- Flexibility: The delegated regulation allows auditors to adapt the audit methodology to the specific circumstances of the VLOP or the VLOSE.
These are just some of the DSA’s requirements that will be tricky and complicated to implement. However, the DSA is essential to creating a safer and more accountable online environment.
Best practice
There is a range of tactical actions that companies can take to enhance users’ privacy rights and transparency about terms of service, consent, algorithms and advertising practices.
In addition to these specific steps, companies should consider implementing several general best practices:
- A well-defined risk management framework: Establishing ongoing risk assessment activities will help companies identify and mitigate risks to users.
- A culture of compliance: This will help ensure that all stakeholders are aware of the DSA requirements and committed to complying with them.
- A robust process for responding to incidents: This will help companies to respond quickly and effectively to any incidents that may arise.
- An oversight process for monitoring and reporting on compliance: This will help companies track their progress and identify areas where they may need to improve.
A trustworthy online environment
The DSA represents a significant step toward regulating online platforms and digital services within the EU.
By introducing audit requirements, the DSA enhances transparency, accountability and user protection in the digital world. Independent audits will serve as a mechanism to ensure compliance with the DSA’s provisions, thereby fostering a safer, fairer and more trustworthy online environment.
Mary Loughney is Director and Head of Technology Risk Consulting at Grant Thornton
Shane O’Neill is Partner and Head of Technical Change, Financial Services Advisory at Grant Thornton
Filipa Sequeira is Senior Consultant of Financial Services Advisory at Grant Thornton