
“The third party didn’t exist”: Falana’s lawyer explains why a Nigerian court ruled against Meta


When a Lagos High Court ruled that Meta was liable for a Facebook video falsely claiming that renowned human rights lawyer Femi Falana, SAN, was terminally ill, the decision immediately stirred debate across Nigeria’s legal and tech ecosystems. The core questions were simple but uncomfortable for Big Tech: why should a platform be held responsible for content it didn’t create, and why did the court treat the case as a privacy breach rather than defamation?

According to Falana’s lawyer, Olumide Babalola, the answers sit squarely at the intersection of privacy law and platform accountability, and they hinge on one critical fact: there was no identifiable third-party publisher.

The video in question portrayed Falana as granting an interview about a serious illness he does not have. While such a claim could ordinarily fall under defamation, Babalola said the legal team deliberately framed the case as a privacy violation because it involved false health information.

A person’s health status, he explained, is deeply personal data. Publishing false information about it without consent amounts to an invasion of privacy, regardless of intent.

Privacy law, as applied by the Nigerian courts, goes beyond secrecy. It protects individuals from being placed in a false light, from the misuse of their image, and from the disclosure of private facts. In this case, the video attributed a medical condition to Falana that never existed, triggering a clear breach of his constitutional right to privacy.

The more controversial aspect of the case was Meta’s role. Rather than suing the individual or organisation that posted the video, Falana’s legal team went after the platform itself.


Babalola said that decision was not strategic opportunism but a direct consequence of failed traceability. The Facebook page that posted the video was identified as Afri Health Centre, but extensive efforts to verify its existence led nowhere.

According to him, there was no registered entity, no contact details, no identifiable individuals, and no presence outside Facebook. In short, the publisher could not be found.

The expectation, Babalola said, was that Meta, as the platform owner, would be able to identify or produce the page owner during legal proceedings. That did not happen. Meta failed to present any verifiable third party responsible for the content.

That failure proved decisive.

In the absence of an identifiable publisher, the court was left with only two parties: the victim and the platform. Under those circumstances, the court rejected Meta’s claim to intermediary immunity.

Babalola summed it up bluntly: if a platform claims it is merely an intermediary, there must be an actual third party. In this case, none existed. If no one else could be shown to have published the content, liability fell on Meta.

The court also appeared unconvinced that adequate vetting or safeguards were in place to prevent anonymous pages from spreading harmful content without traceability. That weakness, Babalola argued, made it impossible to shield the platform from responsibility.

The judgement, he said, marks one of the most significant platform accountability decisions in Africa, even if the facts of the case are unusual.

The long-standing argument that platforms should enjoy blanket immunity because of their scale is losing credibility. Liability can be limited, Babalola said, but it cannot be erased entirely.

For African users, the ruling creates a new legal pathway, particularly in cases involving fake pages, anonymous accounts, or slow platform responses to harmful content. It also raises the stakes for global tech companies operating in jurisdictions where courts are increasingly willing to interrogate how platforms are run, not just what users post.

Babalola believes platforms could significantly reduce abuse and legal exposure by making it clear that users who violate privacy or engage in cyberbullying can be identified.

Freedom of expression, he noted, is not absolute. Once it crosses into the violation of another person’s rights, consequences must follow.

While the judgement may not apply universally, its impact is already being felt. Across Africa’s courts, the assumption of automatic platform immunity is no longer guaranteed, and that shift could reshape how tech companies approach accountability on the continent.
