Austria: Federal Act enacting a Federal Act on Measures to Protect Users on Communication Platforms

Richard Wingfield
10 min read · Sep 14, 2020

In recent years, European governments have taken some of the first steps towards regulating online platforms and, specifically, some of the different types of illegal and harmful content that can be found on them. The German NetzDG is perhaps the most well-known example, but the government of Austria published its own proposals for new legislation at the start of September. (Thank you to Thomas Lohninger, Executive Director of the Austrian digital rights organisation epicenter.works, for raising awareness of this proposal; their own initial assessment of the draft law is well worth a read, as is Matthias Kettemann’s thread on Twitter about the proposals.)

The full title of the proposed legislation is the Bundesgesetz über Maßnahmen zum Schutz der Nutzer auf Kommunikationsplattformen (the Federal Act on Measures to Protect Users on Communication Platforms). In short, however, the legislation would be known as the Kommunikationsplattformen-Gesetz (Communication Platforms Act) or the KoPl-G. In this blog post, I want to take a look at the KoPl-G and set out some first thoughts on what it would mean from a human rights perspective. The proposals are still out for consultation, and may be amended or even withdrawn. If this happens, I shall update this blog post noting the changes.

If you want to read the KoPl-G (in English), I’ve put together a Google Doc with the official translation here.

What would the KoPl-G do?

The KoPl-G would impose new obligations on online platforms of a certain size relating to the reporting, review and removal of certain types of illegal online content, as well as the provision of associated transparency reporting. These new obligations would be accompanied by oversight and enforcement undertaken by KommAustria, the Austrian regulatory body for broadcasting and audiovisual media, with sanctions imposed for non-compliance. After summarising the scope of the KoPl-G (both the companies it would apply to, and the types of illegal online content), I will look at each of these elements in turn.

1. Scope

The rules on which companies are in scope are set out in § 1, namely any “communication platform” which has either more than 100,000 registered users or an annual turnover of more than 500,000 EUR. The definition of a “communication platform” is complex, but in essence it’s any online service whose main purpose or essential function is to enable the exchange of messages or content between users and a larger group of other users by way of mass dissemination.

There are some limited exceptions to the general rules on scope: platforms used to broker or sell goods or services (e.g. Amazon) are exempt, as are non-profit online encyclopedias (e.g. Wikipedia) and media companies if the communication platforms directly relate to journalistic content (e.g. the comments sections of online newspapers). Where there is a dispute over whether a company is within scope, KommAustria will make the final determination.
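To make the scope test concrete, here is a minimal sketch in Python of how the § 1 rules might be modelled. The user and turnover thresholds and the three exemptions come from the draft as summarised above; the class, field and function names are my own illustrative choices, not anything in the law.

```python
from dataclasses import dataclass

@dataclass
class Platform:
    registered_users: int
    annual_turnover_eur: float
    is_marketplace: bool = False             # brokers or sells goods/services (e.g. Amazon)
    is_nonprofit_encyclopedia: bool = False  # e.g. Wikipedia
    is_journalistic_platform: bool = False   # e.g. a newspaper's comments section

def in_scope(p: Platform) -> bool:
    """Rough model of the § 1 scope test (illustrative only)."""
    # The exemptions apply regardless of size.
    if p.is_marketplace or p.is_nonprofit_encyclopedia or p.is_journalistic_platform:
        return False
    # Meeting either threshold is enough to bring a platform into scope.
    return p.registered_users > 100_000 or p.annual_turnover_eur > 500_000
```

In a disputed case, of course, it is KommAustria rather than any such mechanical test that makes the final determination.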

Under § 5, a company within scope must appoint a person (the “responsible representative”) who has the authority to issue orders required to comply with the law. They must also be a German speaker. The contact details of the responsible representative must be provided to KommAustria, and they must be available to KommAustria “at any time”.

The rules on what content is in scope are set out in § 2, and the scope extends to any content which constitutes one of a long list of criminal offences under Austrian law. The content must also be “objectively” illegal and “not justified”. The list of offences comprises:

  • Coercion
  • Dangerous threats
  • Persistent persecution
  • Ongoing harassment by means of telecommunications
  • Accusation of a judicial criminal act that has already been dismissed
  • Insult
  • Unauthorised recordings
  • Blackmail
  • Disparagement of religious teachings
  • Pornographic representations of minors
  • Initiation of sexual contact with minors
  • Terrorist organisation
  • Instructions for committing a terrorist offence
  • Encouragement to commit terrorist offences and approval of terrorist offences
  • Incitement to hatred

2. Reporting and review procedures

§ 3 sets out the obligations on companies in scope when it comes to enabling reporting and consequent review of the illegal content within scope. First and foremost, they must set up “an effective and transparent procedure for dealing with and processing reports on allegedly illegal content available on the platform”. This “reporting procedure” must enable a user to report content, to receive an explanation of how their report will be dealt with, and to receive a response setting out how the report was dealt with, including the reasons for the decision. Such information must also be provided to the user who uploaded the content.

When it comes to deciding whether or not to remove reported content, the company must remove or block access to it within 24 hours if “its illegality is already evident to a legal layperson without further investigation”. If further examination is required, the company has up to 7 days to make a determination, and must then remove or block access to the content if it is found to be illegal.

The company must also make both users (the one who reported the content and the one who uploaded it) aware of the possibility of requesting a review of the company’s decision (the “review procedure”). A review procedure must take place if (a) the company decides to keep the content up, and the user who reported the content requests a review within two weeks, or (b) the company decides to remove or block access to the content, and the user who uploaded it requests a review within two weeks. The review procedure must be completed within two weeks of the request.
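To illustrate how these timeframes fit together, here is a hedged sketch of the § 3 deadline logic. The 24-hour, 7-day and two-week periods are taken from the draft; the function names and the bare datetime arithmetic are my own simplification (the summary above says nothing about, for example, how weekends or holidays are treated).

```python
from datetime import datetime, timedelta

def removal_deadline(reported_at: datetime, evident_to_layperson: bool) -> datetime:
    """24 hours where illegality is evident to a legal layperson,
    otherwise up to 7 days for further examination."""
    return reported_at + (timedelta(hours=24) if evident_to_layperson
                          else timedelta(days=7))

def review_request_deadline(decided_at: datetime) -> datetime:
    """Either affected user may request a review within two weeks of the decision."""
    return decided_at + timedelta(weeks=2)

def review_completion_deadline(requested_at: datetime) -> datetime:
    """The review itself must be completed within two weeks of the request."""
    return requested_at + timedelta(weeks=2)
```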

The only exception to the obligation to carry out a reporting procedure or a review procedure is if, “due to the type or frequency of the reports received, [the company] can assume with a probability bordering on certainty that the reports were either automated or otherwise initiated in an abusive manner”.

3. Transparency reports

The requirements in relation to transparency are set out in § 4. Transparency reports must be produced annually (or, in the case of communication platforms with over one million registered users, quarterly). They must be submitted to KommAustria within one month of the end of the calendar year, and published on the company’s own website. The reports must contain the following:

  • General information on the efforts the company takes to prevent illegal content on its platform;
  • A description of the reporting procedures, the criteria for making decisions on whether to remove or block access to illegal content, and the steps taken to make determinations;
  • The number of reports of allegedly illegal content received in the period, and the number of reports that led to the deletion or blocking of content;
  • An overview of the number, content and results of the review procedures;
  • A description of the organisation, personnel and technical equipment involved in processing reports and review procedures, the technical competence of the staff responsible, and the education, training and supervision of those persons;
  • An overview of the length of time between receiving a report, the start of the review, and the deletion or blocking of the content. These must be broken down into periods of “within 24 hours”, “within 72 hours”, “within 7 days” and “at a later point in time” (see the sketch after this list); and
  • An overview of the number and type of cases in which the company refrained from carrying out a reporting or review procedure.
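As a rough illustration of the reporting mechanics, the sketch below models the § 4 frequency rule and the time-bucket breakdown. The one-million-user threshold and the four buckets come from the draft; the boundary handling (treating exactly 24 hours as “within 24 hours”) and all names are my own assumptions.

```python
from datetime import timedelta

def report_frequency(registered_users: int) -> str:
    """Annual reports, or quarterly for platforms with over one million users."""
    return "quarterly" if registered_users > 1_000_000 else "annual"

def time_bucket(elapsed: timedelta) -> str:
    """Bucket the time from report to deletion/blocking as § 4 requires."""
    if elapsed <= timedelta(hours=24):
        return "within 24 hours"
    if elapsed <= timedelta(hours=72):
        return "within 72 hours"
    if elapsed <= timedelta(days=7):
        return "within 7 days"
    return "at a later point in time"
```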

KommAustria will be able to provide more detailed provisions on the structure of transparency reports and the scope of the reporting obligations.

4. Enforcement and sanctions

As noted above, enforcement is the responsibility of KommAustria, and the various responsibilities and means of supervision are set out in § 7 and § 9.

§ 7 provides for a “complaints procedure” by which users can complain about the inadequacy of a company’s reporting procedure, a failure to provide required information, or the inadequacy of a review procedure. They can do so by contacting KommAustria’s complaints office, although only where (a) they have attempted to complain to the company and received no response, or (b) they did receive a response but the user and the company have been unable to resolve the complaint. In such cases, the complaints office shall bring about “an amicable solution”, either by developing a proposed solution or by informing the user and the service provider of its opinion on the case that has been raised.

If more than five well-founded complaints are made, § 9 provides that KommAustria must initiate a procedure to review the appropriateness of the company’s measures to meet its reporting and review procedure requirements. If KommAustria considers that these measures are inadequate, or that the company’s obligations are being “seriously violated”, then it can, in the first instance, “instruct the service provider to restore the lawful state of affairs and take suitable precautions to avoid future legal violations”. The company must comply with this instruction within 4 weeks. For a second violation, or where the company fails to comply with an instruction, KommAustria can impose a fine.
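The escalation ladder might be modelled roughly as follows. This is only a sketch of my reading of § 9, with the five-complaint trigger and the four-week compliance window taken from the draft; everything else (the names, the return strings, the ordering of the checks) is my own assumption.

```python
def supervision_step(well_founded_complaints: int,
                     measures_adequate: bool,
                     prior_instruction_issued: bool,
                     complied_within_four_weeks: bool) -> str:
    """Rough model of the § 9 escalation ladder (illustrative only)."""
    if well_founded_complaints <= 5:
        return "no review procedure triggered"
    if measures_adequate:
        return "review closed: measures found adequate"
    if not prior_instruction_issued:
        # First instance: instruct the provider to restore the lawful
        # state of affairs and guard against future violations.
        return "instruction issued (4 weeks to comply)"
    if not complied_within_four_weeks:
        return "fine may be imposed"
    return "compliance restored"
```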

§ 9 does contain some constraints: it requires KommAustria to take into account that companies cannot be required to conduct general prior checks of content, and that they only need to take measures which are suitable and proportionate.

KommAustria’s powers to impose sanctions for any non-compliance with the requirements of the KoPl-G are set out in § 10. The only sanction that can be imposed is a fine, although it can be up to EUR 10 million “depending on the severity of the violation” and a range of other factors such as the financial power of the company (based on its annual turnover), the number of registered users that it has, any previous violations, and the extent and duration of the non-compliance. A company which has been issued with a fine can appeal the decision to the Federal Administrative Court.

Separately, the KoPl-G also creates a range of administrative offences in § 11 for which responsible representatives can be punished. Failing to keep their contact details up to date, or not being reachable by KommAustria at any time, would be punishable by a fine of up to EUR 10,000. Failing to take the due care expected of a representative to ensure that the company is meeting its requirements under the law would be punishable by a fine of up to EUR 50,000.

What are the human rights impacts of the KoPl-G?

The major impact of the KoPl-G on human rights is, unfortunately, likely to be an adverse one on freedom of expression. While the proposals provide some safeguards for companies, ultimately the law would require them to make difficult determinations over whether specific pieces of reported content are illegal. Not only is this problematic in principle (really, only courts or other independent and authoritative bodies should make such determinations), but the short time frame for decision-making and the risk of financial penalties for systemic failures to remove illegal content provide powerful incentives to remove content when in doubt.

True, the safeguards are not insignificant. The content must be “objectively” illegal and the requirement to remove within 24 hours only applies if a legal layperson would know very quickly that it was so. There is also the possibility for users to appeal decisions to remove content. Furthermore, companies will be able to defend decisions to keep content up on the basis that there was a justification for doing so, and the financial penalties don’t appear to apply to a single, isolated instance of failing to remove a piece of content after review, even if it was ultimately found to be illegal. But, despite all of this, it will still be far easier for a company to simply remove a piece of content that might potentially be illegal rather than have to go through the risky process of later justifying keeping it up. One would hope that companies with sufficient resources, and a principled commitment to freedom of expression, such as those who are members of the Global Network Initiative, would limit removals to the absolute minimum, but it would be hard to blame a company which simply removed whenever in doubt. The approach that KommAustria takes toward enforcement will also be significant: if they give companies some leeway on the basis that they can’t be expected to monitor content generally, and only need to take measures which are proportionate, then this might help create a relationship whereby companies feel less pressured to remove content.

Exacerbating the risks to freedom of expression, however, is the broad scope of the types of content that people will be able to report. While some are relatively clear cut (such as child sexual abuse imagery), others are potentially very broad: coercion, harassment, insult, disparagement of religious teachings. When law enforcement agencies and courts make these decisions, it takes weeks for all of the relevant evidence to be gathered, for the context to be assessed and understood, and for a determination to be made only after all parties have had the opportunity to present their arguments. And if the content is found to be illegal, the individual affected can appeal that decision to a higher court, arguing that it was a restriction on freedom of expression. None of the safeguards noted above provides anything even approaching an equivalent when companies make the decisions, as they will be obliged to under the proposals. While the sanction for the individual will not be the same (removal of content compared to a criminal conviction), the impact on the right to freedom of expression necessitates sufficient and equivalent safeguards.

Is there anything in the proposals to welcome? Well, as noted above, there are some safeguards that make the proposals an improvement on the NetzDG, which is otherwise very similar. How much of a difference these make in practice remains to be seen. And it is good to see requirements for greater transparency reporting. While the statistical information would need to be properly contextualised, it will certainly be helpful to start getting a better idea of the policies and processes that companies employ when it comes to content moderation, how decisions are made, and what kind of internal training staff receive. It would have been good to see the proposals contain more requirements when it comes to transparency over algorithms and automated content moderation processes, as this is increasingly how content is flagged and blocked.

Overall, the KoPl-G continues the trend of regulatory proposals relating to online content which raise serious concerns over potential impacts on freedom of expression. That trend of forcing companies to decide what is and is not illegal, while putting in place powerful incentives to remove content when in doubt rather than weigh it carefully, is a worrying one. At the same time, however, it is unreasonable to expect our existing law enforcement and judicial processes to be able to manage the vast quantity of online content, and that’s before considering questions of reach, speed and anonymity. While there is certainly a problem to be solved, these proposals are not the solution.


Richard Wingfield

Head of Legal at Global Partners Digital. Posts are written in a professional capacity. @rich_wing on Twitter.