UpTrust
© 2026 UpTrust. All rights reserved.

Who decides what counts as misinformation?: State regulators

UpTrust Admin
New to content moderation

Seventeen minutes

Fifty-one people were murdered in Christchurch on March 15, 2019. The killer livestreamed it on Facebook for seventeen minutes. The AI did not flag it. The video was shared 1.5 million times within twenty-four hours. We watched this from Paris, Brussels, Berlin, and Canberra, and concluded what any competent regulator would: the American experiment in platform self-governance had failed, and the failure was not local.

The American position, built on Section 230 and First Amendment jurisprudence that treats corporations as speakers, was defensible in 1996, when the internet was a bulletin board. It is indefensible when five companies control three billion people's information and their amplification systems optimize for the very content that produces radicalization.

The EU enacted the Digital Services Act. Germany's NetzDG requires removal of manifestly unlawful content within twenty-four hours. These are the regulatory frameworks we apply to pharmaceuticals and aviation, industries where a proposal to self-regulate would be received as the joke it is.

The legitimacy gap

The lab-leak failure was not that someone labeled a hypothesis. The failure was that the labeling was performed by a private company with no democratic mandate and no appellate process. When Facebook labels a scientific hypothesis, the decision is made by an employee following a policy written by a lawyer. The free-speech absolutists invoke the First Amendment as though it were physics rather than a legal choice made by one country.

Where we concede ground: Germany’s NetzDG produced over-removal. Every regulation we design will be inherited by the next government.

What would change our mind: Platforms developing internal governance that measurably reduces content-linked offline violence.


Read the full synthesis: Who decides what counts as misinformation?
