
Court Requires Platforms to Actively Search for Illegal Content

A court ruling increases the responsibility of platforms regarding deepfakes. Photo: Getty Images

May 9, 2025, 12:02 pm | Read time: 2 minutes

Deceptively realistic deepfake videos featuring prominent faces are presenting platform operators with new challenges. A recent decision by the Higher Regional Court of Frankfurt makes clear that a single warning is sufficient: once notified, platforms must proactively block similar content in the future.


According to a recent ruling by the Frankfurt Higher Regional Court, platforms must independently search for and remove similar content when alerted to illegal material, such as deepfakes. The case involved videos featuring celebrities promoting weight loss products.

Platform Ignored Nearly Identical Video

A prominent man was falsely depicted in several deepfake videos on a social media platform as endorsing weight loss products. In reality, he neither consented to the videos nor recommended the products. His face, name, and voice were misused.

The affected individual first demanded, via a lawyer’s letter in July 2024, that the platform remove the content. When that proved insufficient, he sought a preliminary injunction in August 2024 against the distribution of a specific video (“Video 1”).

Later, the case expanded. In November 2024, the man also requested a ban on a second, nearly identical video (“Video 2”), which contained the same statements and depictions. Although the Frankfurt District Court initially dismissed the lawsuit, the Frankfurt Higher Regional Court ruled differently in a recent judgment on these deepfakes.

Ruling on Deepfakes States Platform Must Actively Search for Similar Content

With its decision (Case No.: 16 W 10/25), highlighted by the German Bar Association, the Frankfurt Higher Regional Court clarified the legal situation. Platform operators—known as host providers—are obligated to actively search for and block similar content after a specific notice of a legal violation.

This is particularly true if the reported post is described so clearly that a potential legal violation is easily recognizable. The court emphasized that the provider must prevent the distribution “of content identical to that in Video 2.”


Technical Solutions Demanded

While current EU law prohibits a general monitoring obligation for host providers, the court still sees a responsibility for platforms to use technical means to identify and block nearly identical content—such as with minor differences in image or sound—after specific alerts.

In this case, “Video 1” and “Video 2” were almost identical, differing only in their titles and a few graphic details. The court ruled that the platform should have prevented their distribution without a new warning being required for each similar video; according to the decision, a separate legal complaint for each similar post is unnecessary.

This article is a machine translation of the original German version of TECHBOOK and has been reviewed for accuracy and quality by a native speaker. For feedback, please contact us at info@techbook.de.
