EU DSA is Coming - How to implement a regional law in multiple jurisdictions?

Published on May 3, 2022

Authored by Frane Maroevic, Director of the Content & Jurisdiction Program, Internet & Jurisdiction Policy Network


On Friday, April 22, 2022, the EU Council and Parliament reached a political agreement on the Digital Services Act (DSA). This was a key milestone in the adoption of the Regulation, which now has to be formally adopted by the Parliament and the Council, possibly still in 2022. For those who follow the work of the EU, this has been an impressively fast process.

The DSA will apply to digital/online services offered in the EU and thus have a direct effect on the 450 million people living in the EU. But as we have seen with many other EU laws and standards, such as the General Data Protection Regulation (GDPR), this Regulation is also likely to become a global standard and possibly a template for other online regulations.

Since the announcement of the political agreement, there has been a flurry of analysis and comments on the DSA: what is good, what is bad, what should have been done differently, and what remains to be done. From the perspective of the work ongoing in the Internet & Jurisdiction Policy Network, there are two very complex issues that will need to be addressed in the implementation of the DSA. The first is the territorial scope of orders to remove illegal content; the second is the role of transparency.

One of the most complicated issues is in Article 8 of the DSA, which requires legal or regulatory "Orders to act against illegal content" to specify "the territorial scope of the order". How to determine the appropriate territorial scope of online restrictions has been one of the key topics discussed in the Internet & Jurisdiction Content Contact Group.

Restricting content in other jurisdictions is a very severe measure and needs to be reserved for very exceptional cases. An obvious example is child sexual abuse material (CSAM), where there is broad agreement across the globe that such content is illegal and needs to be removed quickly. When looking at other categories of content, however, things become complicated very quickly. For example, while there is agreement that terrorist content is illegal, there are differences across the globe in what constitutes terrorist content. Even within the EU there are differences in whether blasphemous content is illegal, differences in standards for defamation, and displaying Nazi symbols is illegal in some member states but not in others. It is clear that there will be many complex cases.

What are the principles or standards that will be used by the different institutions when determining the territorial scope of online restrictions that go beyond their own jurisdiction? How will they ensure coherence? How will they ensure that they are not limiting the right to access information in other countries? 

In our work within the Content & Jurisdiction Program, we have identified three key areas to consider: location, harm and normative convergence. Each of these areas can be further subdivided. Location covers where the illegal content was uploaded from, the location of the victim and the location of the internet intermediary. Harm concerns the likelihood, imminence and geographic reach of the harm, as well as the audience and geographic prevalence of the account from which the post originated. Harm also needs to be looked at from the perspective of the potential burden on the victim(s) to obtain redress.

Normative convergence is the assessment of whether something is illegal in multiple jurisdictions and how similar the standards for determining illegality are. It is also about the coherence of national laws with international human rights law and jurisprudence. These three areas can be a starting point for further multistakeholder dialogue on the very rare exceptions of cross-border content restrictions.

The other area where additional multistakeholder consultations will help effective implementation is transparency, especially as the DSA is likely to set not just EU but global standards for transparency reporting by internet intermediaries. For transparency reporting to serve the needs of all stakeholders, the implementation mechanisms will need to include all of them. An inclusive multistakeholder process should also move the debate from transparency as something imposed, or an obligation, to transparency being recognised as a benefit for all.

The Content & Jurisdiction Program Contact Group will shortly publish a framing brief outlining the issues that transparency regimes need to address in order to produce meaningful results for all stakeholders. More information about the Content & Jurisdiction Program Contact Group can be found here.