Fight against illegal content – with care

On the same day it debated the Hungarian anti-paedophilia law, the European Parliament also gave the green light to a regulation on measures against the online sexual abuse of minors. The data protection compliance of this regulation is hotly debated, and at the same time a German court condemned the removal of other, similarly illegal – racist – content. This points to a contradiction: while trying to limit the excessive power of portals and other internet platforms, we also want to make them responsible for putting their house in order – even for disciplining their users.

While the trilogue on the e-privacy regulation, which is to replace the e-privacy directive, is still ongoing between the European Commission and the co-legislators, i.e. the Council and the European Parliament, the recently adopted Regulation allows a derogation from the existing directive.

The purpose is to enable platforms to detect and report online sexual abuse and to remove content showing such illegal activities. They may also report to non-governmental organisations protecting children. To do this, they need to monitor the activities of users in a way in which general monitoring is otherwise not permitted. The Regulation covers not only social networks but all “number-independent interpersonal communication service providers”, i.e. non-phone messengers and chat services as well.

The new rules are considered controversial, among others by liberal and centre-left MEPs: according to them, the regulation contradicts the General Data Protection Regulation and infringes the privacy rights of individuals. Their main concern is that activities that are not illegal can also be filtered (as in the case of grooming detection).

Javier Zarzalejos of the EPP, by contrast, called the same regulation a ground-breaking achievement. The act will only be in force for three years, but privacy activists already worry about what the long-term rules replacing it will be, mainly because of the weakening of encrypted communications. On the other side are children’s rights activists; their challenge, as John Carr, secretary of the UK Children’s Charities’ Coalition on Internet Safety, put it to Politico, is that they have to convince data protection activists that this is not a theoretical issue.

In contrast, the German Federal Court obliged Facebook to reinstate two anti-refugee comments deleted in 2018 for their racist content, and prohibited it from deleting them again or blocking the commenters again. The judges, of course, did not side with the racists; rather, they found the procedure unfair. In this sense the court tried, in its own way, to balance the platforms’ obligation to remove illegal content against the limitation of their excessive power: Facebook had not given the commenters the legitimately expected opportunity to be notified and to present their views on whether the removal of their content was justified.

The Germans take compliance with procedural guarantees very seriously, even where the sanction is otherwise justified: a data protection fine was likewise annulled – despite its legal basis being the GDPR – because it infringed the general procedural rules on administrative fines. This strictness is clearly even more justified in the case of private players, to prevent them from acquiring uncontrolled power under the pretext of protecting the public interest.

Another controversial topic is the monitoring and removal of copyright-infringing content from YouTube and similar video and file-sharing platforms. Providers cannot be expected to check each piece of content individually for copyright infringement; nevertheless, once they become aware of an infringement, they have to act.

If, however, they do not merely display the content but participate actively in making it available to users, then they can be held liable simply for displaying illegal content.

This was recently made clear by the European Court of Justice. In another case, in which Poland sought the annulment of the obligation in the 2019 copyright directive for platforms to actively monitor whether their users unlawfully upload copyrighted material, only the Advocate General’s opinion is available so far. Its author is prominent: Saugmandsgaard Øe also wrote the opinion in the Schrems II case.

In his present opinion, the Advocate General emphasises that, on the one hand, platforms are in certain cases relieved of liability and, on the other hand, the legislator built appropriate safeguards into the act. Moreover, it is not obligatory to remove all content preventively, only content that is identical or equivalent to the protected subject matter identified by the right-holders; in ambiguous cases, for example short extracts or “transformative” works, content will only be removed upon a substantiated request or a court order. Therefore – contrary to what the Polish government claimed – the provision does not restrict freedom of speech disproportionately or in an unjustified manner.

According to the opinion, platforms will not become judges of the legality of content and are not obliged to monitor everything. It also notes, however, that it would not be sufficient for platforms to reinstate such content only following a complaint or appeal from the user. If this opinion becomes a judgment (unchanged or with different content), it can give important guidance in the conflict around the obligations and powers of platforms.

The basic conflict between freedom and security is not new, but new technologies sharpen the contradiction. What is more, the influence of big internet providers on what we see is so great that it is impossible not to oblige them to take social responsibility. This is what the new Digital Services Act does, setting special rules for so-called “gatekeeper” platforms – officially called very large platforms (defined in the Regulation as platforms reaching 10% of the population of the EU). At the same time, there is an endeavour to ensure that technological development and the wide use of technologies are not hindered by bureaucratic rules.

The political dimension of the problem is well illustrated by the position of the European Greens, analysed by VoteWatch on the basis of its MEPs’ votes: while they favour more regulation than the more market-oriented parties (such as the EPP and part of Renew), they are against regulating the internet and side, together with a significant part of the EPP, with the freedom of the net.

The fate of the e-privacy regulation mentioned earlier also shows that the question is not simple – even though that Regulation only concerns a special area: the Council reached a consensus only in February, after several years, and although, according to some reports, an agreement is close, the debates and the waves they have stirred are unlikely to be calmed soon.

László S. Szabó, Szabó Consulting

 
