
Laws that enable censorship

On 25 March, a total of 61 organisations and associations called on MEPs in an open letter to reject a draft regulation that would oblige web hosts to delete terrorist content on the Internet within an hour. Organisations in the field of human rights and digital freedoms believe that the regulation would lead to preventive and automated censorship. They describe the EU anti-terrorism regulation as a “serious threat” to citizens' freedoms.

Danger to freedom of expression and speech

Among the organisations signing the letter are Amnesty International, the Committee to Protect Journalists (CPJ), the International Federation for Human Rights (FIDH), the Human Rights League (LDH), Reporters Without Borders and Wikimedia.

“We urge the European Parliament to reject this proposal as it poses a serious threat to fundamental rights and freedoms, in particular freedom of expression, freedom of access to information, the right to privacy and the rule of law,” the letter states.

For almost two and a half years, human rights groups and groups campaigning for digital freedoms have been warning against the draft European regulation “to prevent the spread of terrorist content on the Internet”. The regulation was originally proposed by the Commission on 12 September 2018; a compromise text was drawn up last December and is due for final adoption on 28 April.

Compromise provides for exceptions

With the help of the regulation, the pressure on providers of Internet content is to be increased massively. It is aimed at any “provider of information society services that stores and disseminates information and content to the public”. The anti-terrorism regulation would thus affect all content platforms, but also publicly accessible forums and blogs. If the regulation is adopted, the authorities of the EU member states can order any hosting provider to remove content they deem terrorist. If the provider does not comply with the order, it must pay a penalty of up to four percent of its turnover.

In the negotiations on the compromise, the original wording was somewhat weakened and some exceptions were added, which the undersigned organisations acknowledge as “improvements compared to the first reading”. There should be exceptions, for example, for journalistic, educational or humorous content. A further improvement concerns the deadline by which a host must delete the content in question: if a host does not have the operational capacity to delete flagged content within one hour, the authorities must take this into account.

From “proactive” to “specific” measures

The first version of the anti-terrorism regulation also required hosts to take “proactive measures” to remove terrorist content. Opponents of the regulation heavily criticised this wording, arguing it would have prompted platform operators to install automatic preventive filter systems. The compromise proposal therefore weakens the concept, replacing “proactive measures” with “specific measures”, which opponents of the regulation criticise in turn.

Under this formulation, the authorities could still require a host to take measures, but could no longer prescribe which ones. At the same time, the authorities would be obliged to monitor the situation regularly and would be likely to impose sanctions if they considered the measures taken inadequate. The regulation thus does not prescribe the use of content filters, but affected providers would almost certainly resort to them in order to stay on the safe side.

According to the authors of the open letter, the implementation of these measures will inevitably lead to preventive censorship of legal content, “since it is impossible for automated programs to distinguish, for example, activism, counter-speech and satire about terrorism from content that is terrorist in nature”. This would mainly affect journalism, minorities and underrepresented and discriminated-against groups.

Many websites already use automated systems to flag suspect content, opponents of the regulation told French media, but the flagged results are then reviewed by humans. Such human review cannot be carried out within a mandatory one-hour deletion deadline; the only way to comply would be fully automated content moderation.

“Political abuse cannot be ruled out”

Another major point of criticism concerns responsibilities. Under the regulation, content complaints and removal orders are to come from “competent authorities”, which each EU member state must designate and which do not necessarily have to be courts. These “competent authorities” would also be given the power to act in all EU member states.

The authors of the open letter therefore warn that this system could be abused for political purposes. Once the regulation is adopted, every member state would be able to issue removal orders. Governments could thus bend the procedure so that it is used against unwelcome groups such as political opponents, activists or migrants.

If the host of the content in question is located in a member state other than that of the authority seeking deletion, the host’s home state should be able to check whether the removal order curtails fundamental freedoms. If it does, the order can be appealed. However, there is no obligation to carry out such a check, and opponents point out that Ireland, where many large platforms have their European headquarters, will hardly be in a hurry to examine and challenge deletion requests.

This “regulation on terrorist content in its current form has no place in European law”, the signatories of the open letter conclude, calling on “the members of the European Parliament to vote against the adoption of the proposal”.