Higher Regional Court of Frankfurt am Main confirms Facebook’s right to delete misleading posts about COVID-19 vaccinations
The Higher Regional Court (OLG) of Frankfurt am Main, in its judgment of 14 November 2024 (Ref. 16 U 52/23), has made a key ruling regarding the moderation and control of content on social networks. In the specific case, the court decided that the operator of the social network Facebook was entitled to remove various posts which, according to the court’s assessment, disseminated misinformation about the effectiveness and potential risks of administering COVID-19 vaccines. The decision highlights the balance between freedom of expression and the protection of users from disinformation, particularly in matters of significant societal importance such as public health.
Background of the court decision
In the underlying proceedings, a user sued Facebook after several of his posts about COVID-19 vaccination were deleted by the company. The posts included, among other things, claims that the vaccines were “ineffective” and “highly dangerous.” Facebook based the deletions primarily on its internal community standards and the platform’s terms of use, which prohibit misinformation and health-related disinformation.
The OLG Frankfurt am Main examined whether these contractual terms provided a sufficient basis for the removal of the disputed content and whether the freedom of expression guaranteed by the Basic Law was disproportionately restricted for the user as a result.
Principle of freedom of expression and its limitations
The court emphasized that freedom of expression under Article 5 of the Basic Law carries great weight and, in principle, also protects pointed or exaggerated opinions. However, this freedom reaches its limits where factual assertions are made that are objectively inaccurate and may thereby significantly impair other legal interests or public discourse. With regard to COVID-19 vaccination, the court found that scientific consensus and ongoing evidence-based research have established a clear factual basis.
Posting demonstrably false factual assertions (such as claims about the general “ineffectiveness” of the vaccines or blanket statements about their “dangerousness”) that contradict evidence-based knowledge is therefore neither covered by freedom of expression nor shielded from content moderation on private platforms.
Relevant aspects of the decision
Contract interpretation and community standards
According to the OLG Frankfurt am Main, the community standards governing communication on the platform are communicated to users early and transparently as part of the user agreement. From the outset, users therefore have a legitimate expectation that certain content boundaries must be observed. If these boundaries are exceeded, the platform operator may lawfully take appropriate measures – such as removing posts.
Balancing decisions
The court further stated that, in each individual case, a balance must be struck between the protection of public health, the integrity of discourse, and the interests of the user expressing the opinion. In the present case, the interest of the general public—especially in light of the pandemic situation and the massive potential for harm from misinformation—outweighed the interest in the unrestricted dissemination of personal assessments involving factual claims.
Scope of the decision
It should be noted that the authority to delete can reach its limits if the disputed content does not constitute factual assertions but merely value judgments or expressions of opinion that are not demonstrably false. The ruling therefore does not constitute a blank check for unlimited content deletion on social platforms, but rather clarifies the standards for an appropriate balance between freedom of expression and the platform operators’ content moderation practices.
Implications for practice
The decision of the OLG Frankfurt am Main follows a line of recent case law that assigns growing responsibility to social network operators in combating disinformation—especially where public health and the public good are concerned. At the same time, it underscores the requirements of transparency and clarity in the terms of use. Platform operators must ensure that users are adequately informed about the applicable rules and that a graduated moderation process exists, with opportunities to comment or lodge a complaint.
Final remark
For companies and individuals faced with issues relating to digital freedom of expression, platform regulation or other aspects of media law, the recent decision by the OLG Frankfurt am Main is of particular relevance. Should doubts or further legal concerns arise regarding freedom of expression, the implementation of community standards, or moderation practices on platforms, the lawyers at MTR Legal are happy to provide in-depth legal support.