May 9, 2025, 12:02 pm | Read time: 2 minutes
Deceptively realistic deepfake videos featuring prominent faces are presenting new challenges for platform operators. A recent decision by the Frankfurt Higher Regional Court, however, makes clear that a single notice is enough: platforms must then proactively block similar content going forward.
According to a recent ruling by the Frankfurt Higher Regional Court, platforms must independently search for and remove similar content when alerted to illegal material, such as deepfakes. The case involved videos featuring celebrities promoting weight loss products.
Platform Ignored Nearly Identical Video
A prominent man was falsely depicted in several deepfake videos on a social media platform as endorsing weight loss products. In reality, he had neither consented to the videos nor recommended the products. His face, name, and voice were misused.
Initially, the affected individual demanded the platform remove the content through a lawyer’s letter in July 2024. When that proved insufficient, he sought a preliminary injunction in August 2024 against the distribution of a specific video (“Video 1”).
Later, the case expanded. In November 2024, the man also requested a ban on a second, nearly identical video ("Video 2"), which contained the same statements and depictions. Although the Frankfurt District Court initially rejected the application, the Frankfurt Higher Regional Court ruled differently in its recent judgment on these deepfakes.
Ruling on Deepfakes States Platform Must Actively Search for Similar Content
With its decision (Case No.: 16 W 10/25), highlighted by the German Bar Association, the Frankfurt Higher Regional Court clarified the legal situation. Platform operators—known as host providers—are obligated to actively search for and block similar content after a specific notice of a legal violation.
This is particularly true if the reported post is described so clearly that a potential legal violation is easily recognizable. The court emphasized that the provider must prevent the distribution “of content identical to that in Video 2.”

Technical Solutions Demanded
While current EU law prohibits imposing a general monitoring obligation on host providers, the court nonetheless holds that, once a specific alert has been received, platforms are responsible for using technical means to identify and block nearly identical content, such as copies with minor differences in image or sound.
In this case, “Video 1” and “Video 2” were almost identical, differing only in title and a few graphic details. The court ruled that the platform should have prevented their distribution without requiring a new warning for each similar video. A separate legal complaint for each similar post is not necessary according to the decision.
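The ruling does not prescribe a particular technology, but the kind of near-duplicate detection it envisions is commonly built on perceptual hashing. The following is a minimal illustrative sketch, not anything from the decision itself: it computes an "average hash" fingerprint for video frames (modeled here as plain 8x8 grayscale pixel grids so the example runs with the standard library alone; real systems would hash frames sampled with tools like OpenCV) and flags frames whose fingerprints differ in only a few bits, so a re-upload with a changed title or minor graphic tweaks still matches.

```python
def average_hash(frame):
    """Map an 8x8 grayscale frame (list of rows, values 0-255) to a
    64-bit fingerprint: each bit is 1 if that pixel is brighter than
    the frame's mean brightness."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of bit positions in which two fingerprints differ."""
    return bin(h1 ^ h2).count("1")

def is_near_duplicate(frame_a, frame_b, max_distance=10):
    """Treat frames as near-identical when their fingerprints differ
    in at most max_distance bits -- robust to small edits such as a
    changed title overlay or minor graphic details."""
    return hamming_distance(average_hash(frame_a),
                            average_hash(frame_b)) <= max_distance

# A synthetic frame and a copy with one slightly brightened pixel
# (standing in for "Video 1" vs. a lightly edited "Video 2").
frame1 = [[(r * 8 + c) * 4 % 256 for c in range(8)] for r in range(8)]
frame2 = [row[:] for row in frame1]
frame2[0][0] += 30  # minor graphic difference

print(is_near_duplicate(frame1, frame2))  # True: still flagged as the same content
```

The threshold (`max_distance`, chosen here arbitrarily) controls the trade-off between catching disguised re-uploads and falsely blocking unrelated videos; production systems tune it and typically combine several hash types across sampled frames and audio.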