The Green Lock Is a Lie
Why the little padlock in your browser protects the people who sell it – not you.
By Markus Maiwald
You see it every day. A tiny padlock – green, grey, sometimes with a company name beside it – sitting in your browser’s address bar. You were trained to look for it. Your bank told you. Your IT department told you. Apple told you. “If the padlock is there, you’re safe.”
You were lied to. Not by accident. By design.
That padlock is not a security feature. It is a receipt. Proof that someone – a company you have never heard of, in a country you have never visited – accepted a payment and stamped a digital form. The padlock does not mean the website is safe. It does not mean the website is honest. It does not mean nobody is watching.
It means someone paid their subscription.
Who Sells Trust?
Behind that padlock sits an industry called Certificate Authorities – CAs for short. Companies like DigiCert, Sectigo, and Entrust. Their business model is elegant in its absurdity: they sell permission slips that allow websites to show you the padlock.
No padlock? Your browser throws a full-screen warning. Skull and crossbones. “THIS SITE IS NOT SECURE.” The user panics. The business loses customers. So the business pays the CA. The warning disappears. The padlock returns.
This is not a security service. This is a protection racket with a billing department.
Think about what is actually happening. A website proves it controls its own domain name – something that takes about twelve seconds of automated checking. For this twelve-second check, enterprises pay $500 a year. Sometimes more. And the “trust” that transaction buys? It is functionally identical to what a free service called Let’s Encrypt provides for zero dollars, zero phone calls, and zero human involvement whatsoever.
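That "twelve-second check" is essentially a challenge-response: the issuer hands the site a random token, the site publishes it at an agreed-upon URL, and the issuer fetches it back. Here is a toy sketch in Python – the token path mirrors the ACME HTTP-01 convention used by Let's Encrypt, but the in-memory `served_files` dict is a stand-in for a real web server, not an actual implementation:

```python
import secrets

def issue_challenge():
    """Issuer side: generate a random token the site must publish."""
    return secrets.token_urlsafe(32)

def validate_domain(domain, token, served_files):
    """Issuer side: 'fetch' the well-known URL and compare tokens.

    In the real ACME HTTP-01 flow the CA performs an HTTP GET against
    http://<domain>/.well-known/acme-challenge/<token>; here a dict of
    URL -> body stands in for the network round trip.
    """
    url = f"http://{domain}/.well-known/acme-challenge/{token}"
    return served_files.get(url) == token

# The site operator publishes the token, proving control of the domain.
token = issue_challenge()
site = {f"http://example.com/.well-known/acme-challenge/{token}": token}

print(validate_domain("example.com", token, site))    # True: padlock granted
print(validate_domain("example.com", "stolen", site)) # False
```

That is the entire proof of "trust" being sold: control of a URL for a few seconds, whether the invoice says $0 or $500.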
The expensive padlock and the free padlock do the same thing. They always did.
An Industry That Failed Every Test It Set for Itself
For the social scientists reading this – the ones who study institutional legitimacy, regulatory frameworks, trust architectures – here is your case study. Here is what happens when you hand a critical public good to a small cartel of private companies and ask them to self-regulate.
They fail. Not occasionally. Structurally.
In 2011, a Dutch Certificate Authority called DigiNotar was hacked. The attackers forged certificates for Google – meaning they could intercept the emails and searches of anyone whose browser trusted DigiNotar. Which was every browser on Earth. The forged certificates were used to intercept the Gmail traffic of an estimated 300,000 Iranian users. DigiNotar collapsed. But the damage – mass surveillance of dissidents – was already done.
Same year. Comodo – one of the largest CAs in the world – was hacked. Forged certificates for Google, Yahoo, Skype, and Mozilla. Same vulnerability. Same architectural failure.
In 2015, Symantec – a name you probably recognize from the antivirus software your office IT department installed on your laptop – was caught issuing thousands of unauthorized test certificates for domains including google.com. Not hacked. Not a rogue employee. A systemic failure of internal controls at one of the world’s largest “security” companies.
In 2019, a company called DarkMatter – a known surveillance contractor for the United Arab Emirates – applied to be included in browser trust stores. They wanted to become a Certificate Authority. Mozilla, the organization behind Firefox, actually debated whether to let them in. A spy agency. Asking for the keys to the internet’s trust infrastructure. And the system had no mechanism to categorically reject them – only a committee discussion.
This is not a list of edge cases. This is the steady state of institutional trust when the institution has no accountability mechanism except its own reputation – and its customers have no alternative except to accept the padlock or be locked out of the modern web.
The Part Nobody Wants to Say Out Loud
Here is the question that the institutional faithful will not ask, because the answer dissolves the framework they built their careers on:
What stops a government from compelling a Certificate Authority to issue a forged certificate?
Nothing. Absolutely nothing.
A CA that operates under the jurisdiction of any nation-state can be compelled – by court order, by national security letter, by a quiet meeting in a government office – to issue a certificate for any domain. That certificate allows the government to intercept, read, and modify any communication between the user and the website. A perfect man-in-the-middle attack, invisible to the user, blessed by the padlock.
The CA does not need to be hacked. It needs to be asked.
You think the NSA – the agency that built PRISM, that tapped undersea cables, that had agreements with every major telecom – does not have access to a root Certificate Authority? You think the intelligence services of China, Russia, Israel, or the United Kingdom respect the sanctity of DigiCert’s signing key?
The padlock is not protecting you from surveillance. The padlock is the infrastructure of surveillance. A green light that tells you to relax while the wire is already tapped.
“But I Trust Apple”
Here is where it gets personal for the Apple faithful.
Your iPhone, your MacBook, your iPad – each ships with a pre-installed list of roughly 150 organizations whose word your device will accept without question. These are the root Certificate Authorities. Apple chose them. You did not. You were not consulted. You cannot meaningfully remove them without breaking half the internet on your device.
Apple – the company that markets itself as a privacy fortress – ships your device pre-configured to trust 150 organizations, many of which operate under the jurisdiction of governments that conduct mass surveillance. Some of these CAs are state-owned. Some are based in countries with no independent judiciary. Some have already been caught issuing fraudulent certificates.
Your device trusts them because Apple decided it should. Not because you decided. Not because they earned it. Because the system requires your silent compliance to function.
This is not privacy. It is the appearance of privacy, sold at a premium.
How Trust Actually Works
Here is something every social scientist knows but the technology industry pretends to have forgotten: trust is not binary.
You do not “trust” or “not trust” a person. You trust your neighbor to water your plants but not to manage your retirement fund. You trust your doctor’s medical judgment but not her stock tips. Trust is contextual. Scalar. It depends on the domain, the history, and the stakes.
The Certificate Authority model ignores all of this. It is a light switch. On or off. Trusted or not trusted. DigiCert says this website is legitimate? Then every browser on Earth trusts it completely, in every context, for every purpose, until the certificate expires.
This is not a model of trust. It is a caricature of trust – designed by engineers who needed a simple boolean and never looked back.
What would real trust look like? It would look like how humans actually evaluate reliability:
- Trust would be earned, not purchased. The people in your community vouch for you because they know you – not because you paid a fee.
- Trust would decay over time. If you have not interacted with someone in a year, your confidence in them naturally decreases. A certificate that says “trusted until 2027” is a fiction.
- Trust would be contextual. I might trust you to relay a message but not to handle my money. A system that cannot express this distinction is architecturally blind.
- Trust would be revocable in seconds, not days. When someone betrays trust, the community knows immediately. Not after a seven-day certificate revocation list refresh.
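As a concrete illustration – the half-life constant and the context names here are invented for the example, not taken from any real system – a trust score with decay and context might look like this:

```python
import time

# Hypothetical half-life: confidence halves after a year of silence.
HALF_LIFE_SECONDS = 365 * 24 * 3600

def decayed_trust(score, last_interaction, now=None):
    """Exponentially decay a 0..1 trust score since the last interaction."""
    now = time.time() if now is None else now
    elapsed = max(0.0, now - last_interaction)
    return score * 0.5 ** (elapsed / HALF_LIFE_SECONDS)

# Trust is contextual: one score per domain, not one score per person.
neighbor = {"water_plants": 0.9, "manage_retirement": 0.1}

now = time.time()
one_year_ago = now - HALF_LIFE_SECONDS
print(decayed_trust(neighbor["water_plants"], one_year_ago, now))  # 0.45
print(decayed_trust(neighbor["manage_retirement"], now, now))      # 0.1
```

Ten lines of arithmetic already express more about trust – context, decay, degree – than the entire Certificate Authority model can.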
What We Built Instead
Libertaria – the decentralized protocol stack we have been building – eliminates Certificate Authorities entirely. Not by asking them to reform. Not by petitioning regulators. By making them architecturally unnecessary.
Here is how, without a single line of code:
Identity is self-issued. Every person and every device in the Libertaria network creates their own cryptographic identity. No authority stamps it. No company validates it. The identity is yours because you created it and your community recognizes it – exactly like identity works in every human society that has ever existed before bureaucrats got involved.
Trust is computed, not purchased. Instead of asking DigiCert whether to trust someone, the network computes a trust score based on your own experience and the attestations of people you already trust. The score is a number between 0 and 1 – not a binary yes/no. It decays over time. It is different in different contexts. It is your assessment, not a corporation’s.
Revocation is instant. When trust is betrayed, the information propagates through the network in seconds – not the hours or days that Certificate Authority revocation lists require. The network is its own immune system.
New members are vouched in by community. A newcomer gets cryptographic vouches from three existing members. These vouches expire in 48 hours if the newcomer does not establish their own reputation. No zombie identities. No phantom certificates. No ghosts in the machine.
| What Changes | The Old Way (CAs) | The New Way (Libertaria) |
|---|---|---|
| Who decides trust? | A company you have never met | People in your community |
| What is at stake? | Revenue (zero marginal cost to issue) | Reputation (social cost to vouch) |
| Can trust be bought? | Yes. That is the business model. | Only at the cost of your standing |
| Can trust be coerced by government? | Yes. National security letters. | No. Cryptographic sovereignty. |
| Does trust decay? | No. Valid until an arbitrary expiry date. | Yes. Naturally, like real trust. |
| Is trust contextual? | No. Binary: trusted or not. | Yes. 0.0 to 1.0, domain-dependent. |
| How fast is revocation? | Hours to days (if it works at all) | Seconds |
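The description above keeps its promise of zero code. For readers who want to see the shape of the mechanism anyway, here is a toy sketch. Everything in it is illustrative – the names, the SHA-256 fingerprint standing in for a real keypair, and the data structures are invented for this example; an actual deployment would use proper digital signatures such as Ed25519:

```python
import hashlib
import secrets
from dataclasses import dataclass

VOUCH_TTL = 48 * 3600  # vouches expire after 48 hours

def self_issued_identity():
    """Create an identity with no authority involved: a random secret and a
    public fingerprint derived from it. (A real system would generate an
    asymmetric keypair, e.g. Ed25519; hashing is a simplified stand-in.)"""
    secret = secrets.token_bytes(32)
    fingerprint = hashlib.sha256(secret).hexdigest()
    return secret, fingerprint

@dataclass
class Vouch:
    voucher: str      # fingerprint of the existing member
    newcomer: str     # fingerprint of the person being vouched in
    issued_at: float  # unix timestamp

def is_member(newcomer, vouches, now):
    """A newcomer is admitted while holding >= 3 unexpired vouches."""
    live = [v for v in vouches
            if v.newcomer == newcomer and now - v.issued_at < VOUCH_TTL]
    return len(live) >= 3

# Three existing members vouch for a newcomer at t=0.
_, alice = self_issued_identity()
_, bob = self_issued_identity()
_, carol = self_issued_identity()
_, dave = self_issued_identity()

vouches = [Vouch(m, dave, issued_at=0.0) for m in (alice, bob, carol)]

print(is_member(dave, vouches, now=3600))          # True: inside the window
print(is_member(dave, vouches, now=VOUCH_TTL + 1)) # False: vouches expired
```

Note what is absent: no fee, no form, no company. Identity comes from you; admission comes from people who stake their own standing to vouch for you; and expiry is enforced by arithmetic, not by a revocation list that may or may not refresh.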
The Institutional Objection – and Its Autopsy
The institutional reflex, the one drilled into every political science seminar and every law review, goes like this: “But without institutions, there is chaos. Without authorities, there is no accountability. Without centralized trust, there is no trust.”
This objection mistakes the map for the territory.
Institutions are not trust. They are proxies for trust – intermediaries that exist because, at one point in history, we lacked the technology to compute trust directly. You needed a church to verify marriages because there was no other shared ledger. You needed a bank to verify payments because there was no other way to prove a transaction happened. You needed a Certificate Authority to verify websites because there was no other mechanism to bind a public key to an identity.
That last condition is no longer true. The technology exists. The cryptography works. The network functions.
The institution persists not because it is needed but because it is profitable. And because an entire professional class – auditors, regulators, compliance officers, security consultants, and yes, academics who study institutional legitimacy – has built careers on the assumption that the intermediary is permanent.
It is not. It never was. It was a workaround. And the workaround just expired.
The Exit
Next time you see the green padlock, understand what it actually represents: a private company’s willingness to stamp a form in exchange for money. Nothing more.
The padlock does not know if the website is honest. It does not know if someone is reading your traffic. It does not know if the certificate was issued under government pressure. It does not know anything. It is a paid logo in your address bar.
We did not petition the Certificate Authorities to change. We did not lobby regulators. We did not write policy papers. We did not ask permission.
We wrote mathematics that makes them irrelevant.
Trust is not a product. It is a computation. And computations do not need a middleman.
This article is the non-technical companion to *The CA Is Dead – How Libertaria Eliminated Certificate Authorities*, which covers the protocol-level implementation. For the full technical specification, see RFC-0016: Sovereign PKI.