Fight against illegal content – with care
While the trilogue on the e-Privacy Regulation, which is to replace the e-Privacy Directive, is still ongoing between the European Commission and the co-legislators, i.e. the Council and the European Parliament, a Regulation passed recently already allows a derogation from the existing directive.
Its purpose is to enable platforms to detect and report online child sexual abuse and to remove content showing such illegal activity. They may also report to non-governmental organisations protecting children. For this they need to monitor users' activities in a way that the general rules on monitoring would otherwise not permit. The Regulation covers not only social networks but all "number-independent interpersonal communications service providers", i.e. non-phone messengers and chat services as well.
The new rules are considered controversial, among others by liberal and centre-left MEPs: according to them, the Regulation contradicts the General Data Protection Regulation and infringes individuals' right to privacy. Their main concern is that the filtering can also catch activities that are not in themselves illegal, such as grooming.
Javier Zarzalejos of the EPP, by contrast, called it a ground-breaking achievement. Incidentally, the act will only be in force for three years, but privacy activists are already worried about the long-term rules that will replace it, mainly because of the potential weakening of encrypted communications. On the other side stand children's rights activists; their challenge, as John Carr, secretary of the UK Children's Charities' Coalition on Internet Safety, put it to Politico, is to convince data protection activists that this is not a theoretical issue.
In contrast, the German Federal Court of Justice ordered Facebook to reinstate two anti-refugee comments deleted in 2018 for their racist content, and prohibited it from deleting them again or blocking the commenters again. The judges, of course, did not side with the racists; they found the procedure unfair. In this sense the court tried, in its own way, to balance platforms' obligation to remove illegal content against the limitation of their excessive power: Facebook had not given the users the opportunity to which they were legitimately entitled, namely to be notified and to present their view on whether the removal of their content was justified.
The Germans take compliance with procedural guarantees very seriously, even where the sanction is otherwise justified: a data protection fine was likewise annulled, despite its legal basis in the GDPR, because it infringed the general procedural rules governing fines imposed by authorities. This strictness is clearly even more justified in the case of private players, to prevent them from acquiring uncontrolled power under the pretext of protecting the public interest.
Another controversial topic is the monitoring and removal of copyright-infringing content from YouTube and similar video and file-sharing platforms. Providers cannot be expected to check every piece of content individually for copyright infringement; nevertheless, once they become aware of an infringement, they have to act.
If, however, they do not merely display the content but participate actively in making it available to users, then they can be held liable even for simply displaying illegal content.
This was recently made clear by the European Court of Justice. In another case, in which Poland asked for the annulment of the obligation in the 2019 Copyright Directive for platforms to monitor actively whether their users upload copyrighted material illegitimately, only the Advocate General's opinion is available so far. Its author is a prominent one: Saugmandsgaard Øe also wrote the opinion in the Schrems II case.
In his present opinion, the Advocate General emphasises that, on the one hand, platforms are in certain cases relieved of liability and, on the other hand, the legislator built appropriate safeguards into the act. Moreover, it is not obligatory to remove all content preventively, only content that is identical or equivalent to the protected subject matter identified by the right-holders; in ambiguous cases, for example short extracts or "transformative" works, content is only to be removed on a substantiated request or a court order. Therefore – contrary to what the Polish government claimed – the provision does not restrict freedom of speech disproportionately or in an unjustified manner.
According to the opinion, platforms will not become judges of the legitimacy of content and are not obliged to monitor everything. It also notes, however, that it would not be sufficient for platforms to reinstate such content only after a complaint or appeal by the user. If this opinion becomes a judgment (whether unchanged or with different content), it can give important guidance on the conflict around the obligations and powers of platforms.
The basic conflict between freedom and security is not new, but new technologies sharpen the contradiction. What is more, the influence of big internet providers on what we see is so great that it is impossible not to oblige them to take social responsibility. This is what the new Digital Services Act does, setting special rules for so-called "gatekeeper" platforms – officially, very large platforms, defined in the Regulation as those reaching 10% of the population of the EU. At the same time, there is an endeavour to ensure that technological development and the wide use of technologies are not hindered by bureaucratic rules.
The political dimension of the problem is well illustrated by the position of the European Greens, analysed by VoteWatch on the basis of its MEPs' votes: while they favour more regulation than the more market-oriented parties (such as the EPP and part of Renew), they are against regulation of the Internet and side, together with a significant part of the EPP, with the freedom of the net.
The fate of the e-Privacy Regulation mentioned earlier also indicates that the question is not simple, even though that Regulation only concerns a special area: the Council reached a consensus only in February, after several years, and although according to some reports an agreement is close, the discussions, and the waves they make, are unlikely to be smoothed over soon.
László S. Szabó, Szabó Consulting