Freedom.gov, the DSA, and the Ancient Question No Algorithm Can Answer
There is a horse galloping above the Earth.
If you visit freedom.gov right now — and I encourage you to — you will find no content, no VPN toggle, no catalogue of banned tweets. What you will find is a ghostly horse, a logo hovering over the planet, and a sentence that reads: "Information is power. Reclaim your human right to free expression. Get ready."
Get ready. For what, exactly?
I want to suggest that what we should get ready for is not a new website. What we should get ready for is the most consequential philosophical collision of the decade — one dressed up as a trade dispute, disguised as a tech policy disagreement, but which is, at its core, the oldest question in political philosophy: Who decides what you are allowed to say, and on what authority?
I. Two Freedoms Walk Into a Server Room
Let me frame this simply.
The European Union, through the Digital Services Act — Regulation 2022/2065 — has built a sophisticated architecture of platform accountability. The DSA does not, technically, tell anyone what to say. It tells platforms what to do about what people say. It creates obligations of transparency, risk assessment, independent auditing. It distinguishes between illegal content as defined by national law and content that creates "systemic risks" — a category elastic enough to include disinformation, electoral manipulation, and threats to public health.
The United States, through the reported freedom.gov initiative led by Under Secretary for Public Diplomacy Sarah Rogers, proposes something radically different: a state-backed portal, potentially equipped with a built-in VPN, designed to let European citizens access content that their own democratically enacted laws have deemed unlawful. The State Department frames this as "digital freedom." The EU sees it as state-enabled circumvention of sovereign regulation.
Now — both sides believe they are defending freedom. And here is where it gets philosophically interesting: they are both right. They are just not talking about the same freedom.
II. Isaiah Berlin's Internet Connection
In 1958, Isaiah Berlin delivered a lecture that every person working in digital governance should be required to read. He called it "Two Concepts of Liberty."
Negative liberty is freedom from — from interference, from constraint, from anyone telling you what you cannot do. It is the liberty of the open field. The American First Amendment is its cathedral. Its logic runs: the answer to bad speech is more speech, never less. The state's only job is to stay out of the way.
Positive liberty is freedom to — to participate in a functioning public sphere, to be free from manipulation, to live in an information environment that does not systematically distort your capacity for self-governance. The DSA, with its systemic risk assessments and algorithmic transparency requirements, is built squarely on this tradition. Its logic runs: you cannot be truly free if the infrastructure of public discourse is optimised for engagement over truth, for outrage over deliberation.
Freedom.gov is an instrument of negative liberty deployed against a regime of positive liberty. It says: your guardrails are our censorship. The DSA responds: your freedom is our systemic risk.
And this is not a misunderstanding. It is an irreducible tension. One that no technical standard, no trade agreement, no diplomatic communiqué can dissolve — because it is not a policy disagreement. It is a philosophical one.
III. The Sovereignty Paradox
Let us be legally precise for a moment, because legal precision matters when governments start building VPNs.
Article 14 of the DSA requires providers of intermediary services to act "in a diligent, objective and proportionate manner" with "due regard to the rights and legitimate interests of all parties involved, including the fundamental rights of the recipients of the service, such as the freedom of expression, freedom and pluralism of the media, and other fundamental rights and freedoms as enshrined in the Charter."
Read that again. The DSA explicitly protects freedom of expression. It does not authorize blanket censorship. It creates a duty of proportionality — rooted in the jurisprudence of the European Court of Human Rights — that demands platforms balance competing rights rather than suppress speech indiscriminately.
Now consider what freedom.gov proposes: routing European internet traffic through American infrastructure so that EU content moderation orders become technically unenforceable. This is not merely a policy tool. It is, in diplomatic terms, the digital equivalent of a state-sponsored smuggling operation — except what is being smuggled is not contraband, but content.
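To make that mechanism concrete, here is a deliberately simplified sketch — all hostnames and functions are hypothetical, and this models no real system. The point is structural: an enforcer that can inspect only the immediate destination of a connection cannot enforce rules about what travels inside a tunnel. Once every request is addressed to one permitted endpoint, a destination blocklist sees nothing to block.

```python
# Conceptual model: destination-based blocking vs. tunneled traffic.
# All names are illustrative, not drawn from any real deployment.

BLOCKED_HOSTS = {"banned.example"}  # hypothetical national blocklist


def isp_allows(destination_host: str) -> bool:
    """The network-level enforcer sees only the immediate destination."""
    return destination_host not in BLOCKED_HOSTS


def fetch_direct(host: str) -> str:
    """A direct request exposes its true destination to the enforcer."""
    if not isp_allows(host):
        return "BLOCKED"
    return f"content from {host}"


def fetch_via_proxy(proxy_host: str, host: str) -> str:
    """A tunneled request exposes only the proxy endpoint."""
    if not isp_allows(proxy_host):
        return "BLOCKED"
    # Inside the tunnel, the proxy fetches the real destination
    # on the client's behalf; the enforcer never sees it.
    return f"content from {host}"


print(fetch_direct("banned.example"))                       # blocked
print(fetch_via_proxy("portal.example", "banned.example"))  # passes
```

The only countermeasure left to the enforcer in this model is to block the proxy endpoint itself — which is why disputes over such tools escalate so quickly from content policy to infrastructure policy.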
And here lies the sovereignty paradox: the United States, which has historically defended the principle that nations have the right to govern their own digital space, is now building a tool designed to make another bloc's domestic law unenforceable. Not through negotiation. Not through a WTO challenge. Through infrastructure.
When Polish President Karol Nawrocki vetoed his country's DSA implementation bill, he invoked Orwell: "The most effective way to take away freedom is not by banning speech, but by imposing a single, officially accepted version of reality." But one might equally say: the most effective way to undermine sovereignty is not through military force, but by making a nation's laws technically irrelevant.
IV. The Cathedral and the Bazaar
We are witnessing the emergence of two competing models for the governance of online speech, and they are both attempting to become universal.
The European model says: platforms are not neutral pipes, they are architects of public discourse, and as such they carry duties analogous to broadcasters, publishers, or public utilities. The DSA operationalizes this through a tiered system — the larger the platform, the heavier the obligation. Very Large Online Platforms must conduct systemic risk assessments, submit to independent audits, and maintain transparency databases of content moderation decisions. Two years in, nearly fifty million platform moderation decisions have been reversed through user appeals — a figure that suggests the system is not merely censorious but genuinely procedural.
The American model — at least as currently articulated by the administration — says: the only legitimate baseline for online speech is the First Amendment. Any departure from that baseline, by any government, is censorship. And censorship anywhere threatens Americans, because US-based platforms operate globally and will inevitably apply the most restrictive standard uniformly.
Freedom.gov is the logical endpoint of this second model. If you truly believe that European content moderation is illegitimate censorship, then building a tool to circumvent it is not provocation — it is, in this worldview, a moral obligation.
But here is what I want you to sit with: neither model is self-evidently correct. Both contain real insights and real dangers.
The European model risks regulatory overreach — as even the DSA's own Recital 27 acknowledges when it warns against "possible negative effects on the availability and accessibility of information that is not illegal content." Upload filters, aggressive takedown timelines, and the sheer volume of daily content make over-enforcement not just possible but probable.
The American model risks regulatory nihilism — the pretence that a speech environment dominated by five private companies, optimised for engagement through algorithms no one elected and few understand, is somehow a "free marketplace of ideas" that requires no public oversight whatsoever.
V. What the Horse Knows That the Guardrail Doesn't
Let me close with something that has nothing to do with technology and everything to do with education.
Socrates did not teach by providing answers. He taught by asking questions. And not comfortable questions — dangerous ones. Questions that made Athenians doubt their gods, their laws, their certainties. The Athenian state, much like a modern content moderator, eventually decided that some of those questions posed a "systemic risk" to the social order. They gave him hemlock. Two and a half thousand years later, we remember Socrates. We do not remember the names of the jurors.
Here is what I believe: questions are the elementary unit of education. Not answers. Not curated information. Not pre-approved narratives. Questions. The awkward ones. The offensive ones. The ones that make institutions uncomfortable. A child who asks "why?" ten times in a row is not being disruptive — she is learning. A citizen who encounters an idea classified as "disinformation" and asks "but is it true?" is not being radicalised — she is thinking.
And this is where the DSA, for all its procedural elegance, commits a philosophical error that no amount of proportionality analysis can correct. When you build a system where a regulator defines what constitutes a "systemic risk" to public discourse — when a bureaucratic body determines which questions are too dangerous to circulate, which answers are too destabilizing to be accessible — you have not created a safer information environment. You have created a catechism. You have replaced education with instruction. You have told the citizen: here is the approved syllabus of reality. Study it. Do not look further.
The DSA's Article 34 requires Very Large Online Platforms to assess "systemic risks," including risks to "civic discourse" and "electoral processes." But who decides what speech threatens civic discourse? In practice, it is the European Commission — an unelected body — working with national Digital Services Coordinators, issuing guidelines on what platforms should suppress before, during, and after elections.
This is what the DSA ultimately does: it decides, in advance, which answers are acceptable. And once you have decided which answers are acceptable, the questions become irrelevant. Not forbidden, technically — just pointless. You may ask anything you like, as long as the only available responses have been pre-filtered through a risk assessment. This is not dialogue. It is theatre.
Freedom.gov — imperfect, provocative, diplomatically reckless as it may be — does something the DSA cannot. It trusts the question. It says: you are an adult, and you are capable of encountering information that your government considers dangerous, and forming your own judgment. It bets on the human capacity for discernment over the institutional capacity for curation. It is, in this sense, the more deeply educational project.
I know that an unfiltered internet contains hate speech, conspiracy theories, recruitment propaganda, and outright lies. But I also know that every authoritarian regime in history has justified censorship by pointing to the genuine dangers of uncontrolled speech. The question is never whether dangerous speech exists. The question is whether you trust citizens more than you trust regulators.
The DSA answers: we trust regulators.
Freedom.gov answers: we trust citizens.
I know where I stand.
Not because I think all speech is harmless — it isn't. Not because I think platforms need no accountability — they do. But because I believe that the capacity to ask an open question and receive an unfiltered answer is the precondition for every other freedom. Take away the question, and you do not need to ban the books. They become irrelevant on their own.
Socrates was killed for asking questions in a city that had decided it already possessed the truth. The DSA is not hemlock. But it operates on the same premise: that certain ideas are too dangerous for citizens to encounter without institutional mediation. And I refuse that premise. Not because I am brave, but because I am educated — and I was educated by questions, not by guardrails.
The ghost horse on freedom.gov gallops above the Earth. Perhaps it carries nothing yet. But it gallops toward something the DSA, in all its five hundred pages, never once provides: an open question, addressed to a free mind, with no predetermined answer.
That is not chaos. That is education. And education — not regulation — is how free societies have always protected themselves.

