31st May 2021

Op-Ed: “The EU’s Digital Services Act: are we still free to conduct business?” by Ben Allgrove

Article 16 of the Charter of Fundamental Rights of the EU ('the Charter') creates in EU law the freedom to conduct a business. On its face, the proposed Digital Services Act ('DSA') includes provisions which inevitably restrict the freedom to conduct a business for impacted companies. That is the express intention of the proposal. This Op-Ed explores whether the restrictions proposed go beyond what is proportionate to achieve the stated policy objectives. If they do, they would be incompatible with Article 16, a topic which has received surprisingly little airtime to date in discussions both of the DSA and of technology regulation in general.

The policy and legal context

The DSA seeks to contribute to the proper functioning of the internal market for intermediary services, and to set out uniform rules for a safe, predictable and trusted online environment, where fundamental rights enshrined in the Charter are effectively protected (Article 1(2)). These objectives are further elaborated on in the recitals – in particular recitals 2 and 4 (preventing fragmentation of the internal market and providing conditions for innovative digital services 'to emerge and to scale up'); recitals 3, 12 and 41 (achieving a safe online space and protecting fundamental Charter freedoms); recitals 5 and 25 (addressing online harms); and recital 39 (ensuring transparency and accountability).

The freedom to conduct a business is a fundamental right: it encompasses the freedom to exercise an economic or commercial activity, the freedom of contract, and free competition (Explanations relating to the Charter of Fundamental Rights), including in particular the right to:

  • be able to freely use, within the limits of its liability for its own acts, the economic, technical and financial resources available to it (UPC Telekabel (C-314/12)); and to
  • determine the terms on which it offers its service (Sky Österreich (C-283/11)).

Given that the DSA would implement an expansive new regulatory regime for intermediary services, it necessarily impacts the freedom of providers of intermediary services to conduct their business. This is not problematic per se: the freedom to conduct a business is not absolute. However, 'where several rights and fundamental freedoms protected by the European Union legal order are at issue, the assessment of the possible disproportionate nature of a provision of European Union law must be carried out with a view to reconciling the requirements of the protection of those different rights and freedoms and a fair balance between them' (Sky Österreich, paragraph 60). In particular, any restrictive measure must individually satisfy the principle of proportionality (Article 52(1) of the Charter), which means it must fulfil the following cumulative conditions:

  • be necessary;
  • be effective, that is, genuinely meet the general interest policy objective(s) pursued;
  • respect the essence of the freedom restricted;
  • be the least restrictive choice available to achieve the policy objective(s) pursued; and
  • not have disadvantages disproportionate to the public policy objective(s) pursued.

Do the restrictions proposed in the DSA meet this test?

The Explanatory Memorandum (in Part 3) takes the view that service providers' freedom to conduct a business is not infringed by the DSA because 'the costs incurred on businesses are offset by reducing fragmentation across the internal market' and also because 'certain obligations are targeted to very large online platforms, where the most serious risks often occur and which have the capacity to absorb the additional burden'. However, contrary to the implicit 'in the round' approach inherent in this statement, EU law actually requires that each restrictive measure individually satisfy the principle of proportionality. There should therefore at least be a discussion during the legislative process as to whether or not each of these restrictions on the freedom to conduct a business can lawfully be justified. If there is no such discussion, the necessary consequence is that the legislature will be giving primacy to select fundamental freedoms, rather than balancing them. We turn now to look at examples which merit this balancing discussion:

Risk Mitigation (Articles 27, 28 and 35)

The DSA:

  • Enables the European Board for Digital Services, in cooperation with the Commission, to issue 'best practices' for very large online platforms ('VLOPs') to mitigate the systemic risks identified (Article 27(2)(b));
  • Enables the Commission, in cooperation with Digital Services Coordinators, to issue ‘general guidelines’ to present ‘best practices’ and ‘recommend’ possible mitigation measures for VLOPs to address specific risks (Article 27(3));
  • Requires VLOPs to design risk mitigation measures with the involvement of representatives of users, representatives of groups potentially impacted by their services, independent experts and civil society organisations (recital 59);
  • Requires VLOPs to 'take due account of any operational recommendations' that third-party auditors address to them, with a view to taking the 'necessary measures to implement them' (Article 28(4)); and
  • Requires VLOPs to participate in semi-mandatory (recital 68) Codes of Conduct that set out ‘commitments to take specific risk mitigation measures’ (Article 35(2)).

The DSA's prescriptive approach to risk mitigation measures is prima facie inconsistent with the Court of Justice's clear guidance that the very essence of the freedom to conduct a business is that a company be free to determine how to comply with legal obligations in the context of its particular, individual business (UPC Telekabel, paragraph 49; similarly Coty Germany (C-580/13); Breyer (C-582/14); and McFadden (C-484/14)). The DSA seeks to allow regulators to impose, directly or indirectly, specific risk mitigation solutions. If VLOPs have a fundamental right to freedom to conduct a business, inherent in that right is the right to design and implement effective risk mitigation strategies that are aligned with their specific business models and needs, so long as these address the systemic risks being targeted. The DSA's approach thus marks a notable escalation in the EU's attempts to prescriptively regulate the technology market, and a departure from this principle of EU law.

Dispute Settlement (Articles 17 and 18)

Article 17 requires online platform service providers (OPSPs) to set up an internal complaint-handling system through which users can complain about content removals they disagree with, and to ensure that any decisions they reach on such user complaints 'are not solely taken on the basis of automated means' (Article 17(5)). This requirement would entail a large logistical effort and significant costs for many businesses, potentially impacting the very essence of the freedom to conduct a business, without a stated justification and despite the fact that users already have a number of redress options, including recourse to courts and regulators (Article 43).

Article 18 then introduces an out-of-court (OOC) mechanism, whereby users of OPSPs may contest 'content moderation decisions' in an OOC forum. In its current form, Article 18 does not even purport to strike a balance between protecting users' freedom of expression and OPSPs' freedom to conduct a business. The requirement for OPSPs to submit to a binding OOC mechanism for all kinds of content moderation decisions, and to incur the associated costs regardless of the merits of the complaint or the outcome of the process, is on its face disproportionate. There are obviously less restrictive means to achieve the stated policy objective, for example introducing a robust internal appeals mechanism and subjecting it to regulatory oversight, or allowing users to complain to regulators, which is indeed already mandated as an alternative under Article 17. Ironically, the proposal also potentially undermines users' exercise of their right to freedom of expression by removing the fundamental constitutional task of balancing competing fundamental freedoms from the court system.

Transparency Reporting (Articles 13, 23 and 33)

The public transparency reporting requirements imposed on intermediary service providers (ISPs) are substantial and, taken in aggregate, raise questions of proportionality in the context of ensuring an adequate level of transparency and accountability. For example:

  • The provisions conflate (i) public transparency for the purposes of ensuring users are sufficiently informed of the terms on which an ISP is offering a service with (ii) the information that a regulator may need to exercise a regulatory jurisdiction.
  • The provisions impose requirements that do not seem necessary to achieve the stated policy objective. For example, Article 13(1)(a) requires intermediaries to include information on the number of orders they receive from Member States, despite the fact that Member States are also required to publish that information directly under Article 44(2)(a).

Statement of Reasons (Article 15)

The requirement imposed on hosting service providers (HSPs) to give users individualised reasons why content has been removed 'at the latest at the time of the removal' is not always going to be an effective means of meeting the DSA's policy objectives of enabling users to exercise their freedom of expression and information, and of ensuring an adequate level of transparency on the part of HSPs:

  • This requirement will logically slow down HSPs' content moderation processes. Would it be a good policy result if the time and cost HSPs spend on giving reasons why content is taken down exceeded the time and cost they spend on actually identifying and taking that content down?
  • Additionally, this requirement fails to consider that providing granular reasons in all instances may actually increase the incidence of 'bad' content on the internet, a known issue for anyone advising in this space. This is because providing granular reasons might allow bad actors to circumvent established content moderation processes.

A blunt requirement to give users granular reasons why content has been removed is also not the least restrictive choice available to achieve the DSA objectives pursued. Moreover, why is the requirement in Article 15(4) to publish all removal decisions in a publicly accessible database necessary or appropriate to achieve transparency and accountability? This treats private, commercial decision-making as a quasi-judicial function. Is this what we want? Would not the same outcome be achieved by requiring HSPs to keep a record of removal decisions and associated statements of reasons for evidential and regulatory purposes? Article 15(4) as drafted is akin to requiring that data protection impact assessments prepared under the GDPR be made public.

Conclusion

In his Opinion in Glawischnig-Piesczek v Facebook (C-18/18), Advocate General Szpunar analysed the question of whether the Article 16 rights had been breached as follows: 'Seeking and identifying information identical to that which has been characterised as illegal by a court seised does not require sophisticated techniques that might represent an extraordinary burden. Such an obligation therefore does not appear to entail an excessive breach of the right to freedom to conduct a business which a host provider operating a social network platform … enjoys under article 16 of the Charter of Fundamental Rights of the European Union' (emphasis added). The DSA is different. It does require sophisticated techniques to be deployed. It does require the deployment of very substantial resources, so much so as to constitute an extraordinary burden on service providers. And it arguably does so without striking a fair balance between the right of the service provider to conduct a business and the other policy objectives that the DSA seeks to achieve. Should we not be talking about that?

 

Ben Allgrove is a copyright, privacy, AI and wider technology expert with an emphasis on digital media and intermediary platforms. The views in this article are the author’s alone. In the interests of transparency, the author discloses that he is a frequent advisor on technology regulation to companies that would be impacted by the DSA.

 
