Facebook bears responsibility for illegal content – ruling on Künast


Judgment of the Higher Regional Court Frankfurt am Main: Platform Liability for Illegal User Content

The Higher Regional Court Frankfurt am Main issued a significant ruling on January 26, 2024 (Case No. 16 U 65/22) regarding the responsibility of social media platforms for illegal third-party content. The case centered on a cease-and-desist and deletion request by Member of the Bundestag Renate Künast against Meta Platforms Ireland Limited, operator of “Facebook.” The decision has far-reaching implications for the extent to which platform operators can be held liable for offensive and defamatory user posts.

Facts and Procedural History

Initial Situation

Renate Künast was confronted on the Facebook platform with statements that she considered insulting and in violation of her general personality rights. After being notified, the platform operator was requested to remove the content in question. Since no timely and complete deletion occurred, Künast filed a cease-and-desist and deletion application with the Regional Court, which initially granted her claim in part. Meta Platforms appealed this decision.

Legal Background

The core issue was under which conditions a platform operator is obliged to cease and desist and to delete content pursuant to §§ 1004, 823 BGB by analogy, taking telemedia law provisions into account, when a person's personality rights are violated by content published by third parties, for example in the form of insults. Of particular relevance was to what extent the provider can be held liable as an indirect interferer and how its duties to examine and respond after receiving a complaint are structured.

Decision of the Higher Regional Court Frankfurt am Main

Scope of the Examination Duties

The Higher Regional Court affirmed that commercial platform operators, when notified of illegal content, must promptly and carefully verify whether a legal violation exists. This includes not only formal but also substantive assessments. The court clarified that mere acknowledgment of the reported post and its isolated examination is insufficient. Instead, the context must be considered, which may arise from comments, links, or available background information.

Liability and Cease-and-Desist Claims

In the specific case, the court found that the threshold for a violation of personality rights had been exceeded. The content classified as insulting had to be removed without undue delay after notification. According to the Higher Regional Court, the platform operator acted negligently by failing to delete the content or by deleting it only incompletely. The court affirmed the platform’s liability for omitted or delayed deletions from the moment a qualified notice was received.

Deletion of “Substantially Identical” Content

A key point of the decision is that the operator is not only required to delete the specifically reported content but must also independently prevent substantially identical violations on its platform. Once an infringement has been identified, the platform therefore has a proactive duty to remove comparable attacks on the affected person’s personality rights. At the same time, the Higher Regional Court emphasized that a fully automated examination process does not meet the requirements of careful handling.

Impacts and Classification

Significance for Platform Operators

This ruling concretizes and expands the due diligence and verification duties of platform operators. For companies in the social media sector, it increases the responsibility to process reported legal violations quickly and thoroughly. The decision sets a precedent in particular for cases in which infringements such as insults or defamation recur across structurally comparable posts.

Personality Rights Protection

For affected persons whose rights are infringed by offensive publications on social networks, the judgment strengthens their ability to enforce their protection claims effectively, as it requires not only the formal deletion of reported content but also the lasting prevention of subsequent postings.

Further Procedural Steps and Note on the Legal Situation

Finally, it should be noted that further legal remedies may be available against the Higher Regional Court Frankfurt’s judgment. Compliance with data protection and media law provisions as well as the interaction with the Network Enforcement Act remain highly relevant in such contexts. The decision refers to a specific individual case and must always be assessed in the context of the particular circumstances and further supreme court jurisprudence.

Source: Judgment of the Higher Regional Court Frankfurt am Main, Case No. 16 U 65/22; status: February 2024


In view of the continually growing challenges in the field of freedom of expression law and platform liability, it is advisable to consider legal counsel for complex legal issues regarding violations on the internet and the liability of platforms. The attorneys at MTR Legal possess extensive experience in the relevant areas of law and support companies as well as individuals in asserting and defending their rights.
