The Digital Services Act is tiptoeing towards regulatory failure


Konstantinos Komaitis, Senior Director, Policy Development at the Internet Society, argues that upload filters should not be part of the proposed DSA legislation

As the Digital Services Act (DSA) package – Europe’s proposal on the roles and responsibilities of online platforms – makes its way through the Brussels institutional machinery, the stakes are getting higher. The danger of regulatory failure is real.

Much has happened since the DSA was introduced late last year. A draft progress report [1] submitted by the Portuguese Presidency of the Council of the European Union sheds light on the differing visions member states have of the scope and enforceability of the DSA. But a more concerning debate is unfolding in the European Parliament. Proposals for algorithmic tools that would enable companies to address illegal content are being discussed for inclusion in the DSA package. In other words, upload filters.

Upload filters are automated decision-making tools that technology companies use to help moderate the massive amount of content uploaded to their networks every day. They have been criticised for their inefficiency [2] and for the private law enforcement [3] capabilities they give companies. They are tools that undermine the Internet.

The conversation around upload filters is a familiar one in Europe. During a highly contested debate in 2019, upload filters became a mandated tool in Europe’s controversial [4] copyright reform. At the time, more than 150,000 people throughout the continent took to the streets with slogans such as “Save your Internet”; despite the protests, the copyright directive was still passed. Earlier this year, members of the European Parliament voted in favour of the Terrorist Content Online Regulation (TERREG), which many civil society and technology organisations fear will incentivise online platforms to use automated content moderation tools.

Some members of the European Parliament now want the same for the DSA. Unlike the copyright directive, however, the DSA will be a horizontal legal instrument. Imagine an umbrella under which all existing and future European legislation related to content will be hosted. What the GDPR is for privacy, the DSA will be for content. Upload filters should not be part of such a powerful legal tool.

Moreover, just like the GDPR, the DSA aims to set a global standard for the responsibilities of online platforms. Upload filters cannot – and should not – be that global standard. Making them one would expand the power of online platforms to create and use technology that controls the way content is moderated. Competition issues aside, upload filters mandated at that level of legislation would grant companies sweeping private enforcement powers.

It’s strange that Europe continues to insist on upload filters. In fact, Europe lacks any real evidence that automated tools are actually effective in content moderation. The European Commission’s guidance on the implementation of Article 17 of the Copyright Directive (the upload filter provision) has yet to be published, two years after the directive was adopted. Additionally, the European Court of Justice – the bloc’s highest court – has postponed publication of Advocate General Øe’s Opinion in the challenge Poland brought before it against upload filters. In its complaint, Poland seeks the annulment of the directive’s upload filter provisions, asserting that they infringe the right to freedom of expression and information guaranteed by Article 11 of the Charter of Fundamental Rights of the European Union.

Nobody said this would be easy. The DSA’s attempt to modernise the e-commerce directive, the backbone of Europe’s commitment to an open and interoperable Internet, is no small feat. The Internet is far more complex than it was twenty years ago, and the players have changed. The world is not the same either. The Internet and the services it supports are now being used as a weapon against states and users.

Striking a balance that recognises the complexity and diversity of the Internet, while acknowledging the changing nature of content and the weaponisation of its services, is not easy. This is especially clear if one looks at the way similar conversations are unfolding in the U.S. around Section 230, the “big brother” of the e-commerce directive. That debate is highly politicised, and there is considerable concern that the law’s potential repeal could shake the foundations of an open Internet in the United States.

The language in the Commission’s original proposal is far from perfect, but it did preserve what has worked consistently and predictably for the Internet since the early 2000s. It is both visionary and responsible. The Commission has delivered on its promise. It is critical that the European Parliament does the same and resists the emerging trend among some European countries, like France, Germany and Austria, that have enacted laws transferring enforcement to private companies through tools of questionable effectiveness. Upload filters are a quick fix to a much bigger, societal problem and should not be relied on as a solution.

Keep them out of the DSA.

References

  1. https://data.consilium.europa.eu/doc/document/ST-8415-2021-INIT/en/pdf
  2. https://www.techdirt.com/articles/20181214/17272041233/youtubes-100-million-upload-filter-failures-demonstrate-what-disaster-article-13-will-be-internet.shtml
  3. https://www.liberties.eu/en/stories/uploa-filter-back-eu-2020/18938
  4. https://www.greeneuropeanjournal.eu/link-taxes-and-upload-filters-will-not-fix-the-internet/