Right. So last Tuesday, a contractor on one of our projects — smart bloke, solid engineer — sends me a Slack message asking if I can “PGP encrypt” a document before sending it over. For security, obviously.
I sat there for a bit. Stared at the cursor. Made another coffee.
Then I typed: “I’ll send it on Signal. What’s your number?”
He was confused, and honestly, I get it. PGP feels like the serious, proper option. It’s been around forever. It’s got “cryptography” written all over it — key rings and fingerprints and ASCII-armoured blocks that look impressively incomprehensible. If you learned about encryption at any point in the last thirty years, someone probably told you PGP was the gold standard.
They were wrong. Not maliciously — just… out of date. And I can’t keep having this conversation one Slack message at a time, so here we are. Sunday afternoon, a cup of tea, and a blog post that I suspect will annoy some people I quite like.
tl;dr — PGP is a 1990s protocol with 1990s cryptography and 1990s usability. Every serious cryptographer I respect has been saying this for years. Use Signal on your phone instead. Use Magic Wormhole for file transfers. Use age for encryption at rest. Stop using PGP. I say this with love.
The People Who Changed My Mind
I want to be honest about this: I used to use PGP. I had a key. I went to a key-signing event once. I thought I was very clever. I wasn’t clever. I was just doing what everyone told me to do, without thinking hard enough about whether it actually worked.
The people who changed my mind are some of the most respected voices in applied cryptography. I’m drawing on their work heavily here, and you should read them directly — they’re better at explaining this than I am.
Thomas Ptacek (tptacek on Hacker News, @tqbf on socials) — co-founded Matasano Security, then Latacora. His essay “The PGP Problem” [1] is the single most comprehensive takedown of the protocol ever written. It’s long, it’s detailed, and it changed how I think about cryptographic systems design. Ptacek has also been consistently vocal on Hacker News about the specific ways PGP fails in practice — not just in theory, but in the lived experience of people trying to use it for real things.
Moxie Marlinspike — founded Open Whisper Systems, created Signal, and co-designed the Signal Protocol with Trevor Perrin. Moxie didn’t just criticise PGP — he built the thing that replaced it for messaging. His essay “The Ecosystem is Moving” [11] is a fascinating read about why he chose centralisation over federation, and why that trade-off matters for getting cryptography into the hands of people who actually need it. There’s a pragmatism there that I really respect.
Trevor Perrin — co-designed the Signal Protocol and the Double Ratchet algorithm [5] with Moxie, then went on to create the Noise Protocol Framework [6]. His 2017 talk at Real World Crypto [14] is the best technical overview of how these protocols actually fit together. Trevor’s work underpins how WhatsApp communicates with its servers, how WireGuard does its handshake, and a growing number of other protocols. He’s quietly one of the most influential cryptographic protocol designers working today, and I don’t think he gets enough credit.
Matthew Green — cryptography professor at Johns Hopkins. Wrote “What’s the Matter with PGP?” [2] back in 2014, when most of us were still dutifully exchanging key fingerprints at conferences like it meant something. He later wrote the definitive analysis of the EFAIL disclosure [12] — not just the vulnerability itself, but the way the PGP community responded to it, which is honestly the more damning part.
Filippo Valsorda — was the Go security lead at Google. In 2016, he publicly gave up his long-term PGP keys [3] and explained why in a post that I found genuinely moving. Here’s someone who’d been deep in the PGP ecosystem — offline master keys on a Raspberry Pi, YubiKeys, key-signing parties across continents, a paper in Phrack on fingerprint bruteforcing — and he walked away. Not because he didn’t understand PGP, but because he understood it too well. He later documented how the keyserver infrastructure is fundamentally broken [4], wrote about authentication in the age tool [16], and built age [18] itself — which is what PGP file encryption should have been all along.
Brian Warner — built Magic Wormhole [15] and presented it at PyCon 2016 [10]. If you haven’t tried Wormhole yet, set aside ten minutes. Seriously. It’s one of those tools where you use it for the first time and just grin. Simple, secure file transfer that doesn’t require exchanging keys or setting up accounts or any of the ceremony that makes PGP so exhausting.
Frank Denis (jedisct1 on GitHub [19]) — created libsodium, which is probably the most widely-used modern cryptographic library in the world, and Minisign, which brought Ed25519 package signing to every platform. If you’ve used any application that does cryptography sensibly in the last decade, there’s a good chance Frank’s code is somewhere in the stack.
Soatok — security engineer and applied cryptographer who’s been writing sharp, accessible criticism of PGP for years. Good at explaining why the alternatives are better without being dismissive of people who haven’t caught up yet. We need more of that.
Rob Locher wrote a thoughtful piece [9] that starts from a position of genuine sympathy for PGP — he cares about privacy rights and sees encryption as a tool for protecting them. I agree with all of that. Where I diverge is on whether PGP is still the right tool for that fight.
This isn’t a “well, some experts disagree” situation. This is a “the entire applied cryptography community has been trying to tell us something for a decade and we haven’t been listening” situation.
What’s Actually Wrong
I could write a book. Ptacek and Green basically already have. But here’s the version I’d give you over a cup of tea in Larnaca, and I’ll try to be fair about it.
The Cryptography Is Ancient
PGP defaults to 2048-bit RSA, 64-bit-block CAST5 in CFB mode, and an authentication mechanism called the MDC that was bolted on as an afterthought in 2000. Someone finally noticed that PGP ciphertexts weren’t authenticated at all — which is a bit like realising your front door lock doesn’t actually engage the deadbolt — and the fix they came up with was to SHA-1 hash the plaintext, attach it to the plaintext, and then encrypt the whole lot in CFB mode.
I don’t want to be unkind about code that was written decades ago under very different constraints. Phil Zimmermann was fighting a genuinely important battle. But modern cryptography teaches us to authenticate ciphertexts not plaintexts, avoid CFB mode entirely, never use 64-bit block ciphers, prefer elliptic curves over RSA, and never mix compression with encryption. PGP violates ALL of these principles.
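To make “authenticate ciphertexts, not plaintexts” concrete, here is a deliberately toy encrypt-then-MAC sketch in Python (standard library only; the SHAKE-based keystream is purely illustrative, not a real cipher, and all function names are hypothetical). The point is the ordering: the MAC covers the ciphertext, and the receiver verifies it before doing any decryption, the opposite of PGP’s hash-the-plaintext MDC.

```python
import hashlib
import hmac
import os

def _keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    # Toy keystream for illustration ONLY -- real code should use an AEAD
    # such as ChaCha20-Poly1305 from a vetted library, never this.
    return hashlib.shake_256(key + nonce).digest(n)

def encrypt_then_mac(enc_key: bytes, mac_key: bytes, plaintext: bytes):
    nonce = os.urandom(16)
    ct = bytes(p ^ k for p, k in
               zip(plaintext, _keystream(enc_key, nonce, len(plaintext))))
    # Authenticate the *ciphertext* (plus nonce), not the plaintext:
    # forgeries are rejected before decryption ever runs.
    tag = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    return nonce, ct, tag

def decrypt(enc_key: bytes, mac_key: bytes, nonce: bytes, ct: bytes, tag: bytes):
    expected = hmac.new(mac_key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):  # constant-time check first
        raise ValueError("authentication failed")
    return bytes(c ^ k for c, k in
                 zip(ct, _keystream(enc_key, nonce, len(ct))))
```

Flip a single ciphertext byte and `decrypt` refuses before touching the keystream; PGP’s design, by contrast, decrypts first and checks the plaintext hash afterwards.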
As Ptacek puts it in [1]: you can have backwards compatibility with the 1990s, or you can have sound cryptography. You cannot have both. And PGP chose the 1990s.
Key Management Is a Disaster (and TOFU Is the Honest Answer)
PGP’s answer to “how do I know this key belongs to this person?” is the Web of Trust. The idea is elegant: you go to a key-signing party, you check someone’s passport, you sign their key, and through chains of signatures, strangers can verify each other’s identities without a central authority.
I genuinely like this idea in the abstract. The problem is it doesn’t work in practice, and it never really did — not at scale.
Valsorda describes years of personal experience [3] where contacts either grabbed “the best-looking key from a keyserver” or just resent the message unencrypted when PGP didn’t cooperate. He’d done EVERYTHING right — offline master key, YubiKeys, key-signing parties on multiple continents — and his honest assessment was that the whole apparatus was functionally useless.
The Web of Trust worked for about fifty people at three conferences in the 1990s. For everyone else, it was always theatre.
Here’s what I think the uncomfortable truth is: trust-on-first-use (TOFU) is the only key management model that actually works at scale. It’s what Signal does. It’s what SSH does. It’s what WhatsApp does for two billion people. You see a key the first time you connect, you accept it, and if it ever changes, the system shouts at you. Is it theoretically weaker than a fully-verified Web of Trust? Yes. But a security model that people actually use will always beat one that’s technically superior but functionally imaginary.
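For the avoidance of doubt about what TOFU actually involves, here is a minimal pinning sketch in Python (standard library only; `KeyStore` and its behaviour are hypothetical, a simplification of what SSH’s known_hosts and Signal’s safety numbers do):

```python
import hashlib

class KeyStore:
    """Minimal trust-on-first-use (TOFU) pinning sketch.

    First contact: remember the key's fingerprint. Every later contact:
    compare, and refuse loudly if it changed. Real systems persist pins
    to disk and give users a way to re-verify out of band.
    """
    def __init__(self):
        self.pins = {}  # identity -> hex fingerprint

    def check(self, identity: str, public_key: bytes) -> str:
        fp = hashlib.sha256(public_key).hexdigest()
        pinned = self.pins.get(identity)
        if pinned is None:
            self.pins[identity] = fp  # trust on first use
            return "pinned"
        if pinned == fp:
            return "ok"
        raise ValueError(
            f"KEY CHANGED for {identity} -- possible MITM, verify out of band")
```

The scary path is the last one: a changed key is treated as a potential man-in-the-middle rather than silently accepted, which is exactly the “the system shouts at you” behaviour described above.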
Phone numbers as first-class credentials aren’t elegant. I know. But two billion people already have them, and key-signing parties — lovely as they were — belong to a world that doesn’t exist any more.
And then there are the keyservers. In 2019, someone demonstrated a trivial attack — just signing a target’s key thousands of times with throwaway keys — that caused GnuPG to grind to a halt for ten minutes trying to import the poisoned key [4]. Over 54,000 signatures on a single key. GnuPG’s key parser went quadratic. No cryptography was involved in the attack whatsoever. It was a parsing bug in a system that was designed to accept packets from anyone on the internet without any limits, and apparently nobody thought this might be a problem.
That’s not a bug. That’s an architecture that was never designed for a hostile internet. Which is strange, when you think about it, given that hostile internet is the entire threat model PGP is supposed to address.
No Forward Secrecy
This is the one that should properly frighten you.
If someone compromises your PGP key — today, next year, in ten years — they can decrypt every message ever encrypted to that key. Every. Single. One. All the way back to whenever you created the key.
Modern protocols use the Double Ratchet algorithm [5], co-designed by Trevor Perrin and Moxie Marlinspike. It combines a Diffie-Hellman ratchet with a symmetric-key ratchet to generate new encryption keys for every message. The maths is beautiful, actually — each ratchet step provides three properties: resilience (output keys look random), forward security (past keys stay random even if current keys are compromised), and break-in recovery (if an attacker briefly compromises a session, future keys become secure again once new entropy is mixed in).
Compromise one key in a Double Ratchet system, you get one message. PGP gives an attacker the whole archive.
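A hedged sketch of the symmetric-key half of that design, in the style of the Double Ratchet specification’s HMAC-based KDF chain [5] (this is an illustration of the mechanism, not Signal’s implementation):

```python
import hashlib
import hmac

def kdf_chain_step(chain_key: bytes):
    """One symmetric-ratchet step: derive a one-off message key, then
    advance the chain key and forget the old one. The one-way HMAC means
    compromising today's chain key reveals nothing about message keys
    already derived and deleted -- that's the forward secrecy."""
    message_key = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
    next_chain_key = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
    return next_chain_key, message_key

ck = b"\x00" * 32      # initial chain key (supplied by a DH ratchet step in practice)
keys = []
for _ in range(3):     # every message gets its own fresh key
    ck, mk = kdf_chain_step(ck)
    keys.append(mk)
# each mk encrypts exactly one message, then gets securely deleted
```

This shows only the forward-secrecy half: once a message key is used and deleted, a later compromise of the chain cannot recover it. Break-in recovery comes from the other half, the Diffie-Hellman ratchet, which periodically mixes fresh entropy into the chain.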
If you assume — and at this point I think you really should — that nation-state adversaries are recording encrypted traffic right now to decrypt later when they get the compute or the key, then long-term PGP keys aren’t a security measure. They’re a liability. Valsorda makes this point beautifully in [3]: your long-term key is as secure as the least secure moment across its entire lifetime. Every dodgy laptop. Every backup you’re not quite sure about. Every OS you didn’t patch fast enough.
The Usability Will Actually Get People Hurt
This is where I get properly worried, because usability isn’t just a convenience issue — it’s a safety issue.
In 1999, Alma Whitten and Doug Tygar published a landmark usability study [7]. They put technically literate people in a room with PGP 5.0 and asked them to encrypt an email. The results were grim. People sent unencrypted messages thinking they were encrypted. They encrypted to the wrong keys. They couldn’t distinguish between public and private keys. By the end of the session, most participants hadn’t managed a single successful encrypted exchange.
A follow-up study in 2015 [8] — SIXTEEN YEARS LATER — found the situation was essentially unchanged. Same problems. Same confusion. Same failure rate. With better software, better interfaces, and a decade and a half of iteration. If that doesn’t tell you this is a fundamental design problem rather than a UX polish issue, I don’t know what will.
And then EFAIL happened in 2018 [13]. Researchers demonstrated that email clients could be tricked into silently exfiltrating the plaintext of PGP-encrypted messages to an attacker’s server. It was accepted at USENIX Security and Black Hat — two of the most prestigious venues in the field. As Green wrote [12], it was “one of the best cryptographic attacks of the last five years” and “a pretty devastating indictment of the PGP ecosystem.”
The GnuPG project’s response was to blame the email clients. Green’s analysis of this is worth reading in full [12], because his point is sharp: when multiple independent clients all make the same security mistake with your API, the problem isn’t the clients. The problem is your API. The GnuPG library was releasing unauthenticated plaintext to callers before MDC validation completed. That’s a systemic design flaw.
The EFF — the Electronic Frontier Foundation, the people who’ve been fighting for encryption rights since before most of us had email — responded by recommending that people stop using PGP email plugins entirely [13]. They didn’t say this lightly. When the EFF tells you to stop using an encryption tool, it’s worth pausing and asking yourself why.
“But I Need Encrypted Email”
Do you, though?
I’m asking genuinely. Because email is fundamentally insecure. Even with PGP bolted on top, it’s default-plaintext. One wrong click and your carefully encrypted message gets forwarded in clear text, quoted in a reply, or CC’d to someone’s entire team. Subject lines? Never encrypted. Metadata — who’s talking to whom, when, how often? Always visible. Forward secrecy? None. Deniability? None.
“Encrypting email is asking for a calamity. Recommending email encryption to at-risk users is malpractice.” — Latacora [1]
That’s a strong statement. I’ve sat with it for a while. And I agree with every word of it.
If you’re a journalist protecting a source, or an activist in a country where the government reads your email, or a whistleblower — please, PLEASE, do not rely on PGP. The people who need encryption the most are the people most likely to be harmed by PGP’s failure modes. Use Signal. Use it on your phone. Use it today.
What to Actually Use
For Messages: Signal (On Your Phone)
Signal on your phone. Not the desktop client — the phone app, where the key material lives on a device you physically control.
The Signal Protocol — co-designed by Moxie Marlinspike and Trevor Perrin — gives you authenticated key exchange, forward secrecy via the Double Ratchet [5], deniable messages, and modern cryptographic primitives throughout. Trevor went on to formalise many of the underlying patterns into the Noise Protocol Framework [6], which has since been adopted well beyond messaging — WireGuard uses it, WhatsApp uses it for client-server transport, and a growing number of protocols are building on it.
Signal is paranoid about metadata in the best possible way. Sealed sender. Private contact discovery. Encrypted profiles where even the server can’t see your name or photo. When subpoenaed, they’ve been able to hand over essentially nothing — because they genuinely don’t have it. That’s how you design a system.
As Moxie argued in “The Ecosystem is Moving” [11], there’s a genuine tension between federation and the ability to iterate quickly on cryptographic protocols. Signal chose centralisation deliberately, not out of laziness but because it allows them to upgrade every user simultaneously. When you find a vulnerability or want to roll out a better primitive, you can do it in one release. Federated systems get stuck at the lowest common denominator — which, as PGP demonstrates, can be the lowest common denominator from 1997.
WhatsApp: The Pragmatic Option
Here’s the bit that makes PGP purists twitch: WhatsApp uses the Signal Protocol too. Moxie and the Open Whisper Systems team worked directly with WhatsApp to integrate it in 2016. End-to-end encryption for text messages, voice and video calls, photos, videos, voice messages, documents, and file attachments — all of it encrypted with the Signal Protocol, keys never leaving your device.
Yes, it’s owned by Meta. Yes, I know. I KNOW.
The caveats matter, and I want to be honest about them:
- The client is closed-source. You’re trusting Meta not to do something stupid or malicious with the app itself. Signal’s client is open-source and auditable.
- Cloud backups were historically a gaping hole. WhatsApp backups to iCloud or Google Drive were unencrypted for years — meaning all your “encrypted” messages sat in plaintext on Apple or Google’s servers. They’ve since added optional end-to-end encrypted backups, but it’s opt-in, and I’d wager most people haven’t turned it on.
- Meta collects metadata. Who you talk to, when, how often, your phone number, your contacts. Signal collects essentially none of this.
So: Signal if you care deeply about metadata privacy and want to trust the whole stack. WhatsApp if you need to reach your mum, your plumber, your colleagues in another country, or any of the two billion people who already have it installed. Either way, the underlying cryptography is LEAGUES ahead of PGP. And either way, you get forward secrecy, authenticated key exchange, and modern primitives — things PGP has never offered and probably never will.
The perfect is the enemy of the good, and Signal-protocol-based messengers are bloody good.
For Files: Magic Wormhole or age
Need to send a file to someone? Magic Wormhole [15], built by Brian Warner and presented at PyCon 2016 [10], uses a password-authenticated key exchange (a PAKE, specifically SPAKE2) over a short one-time code to negotiate an encrypted transfer. You run wormhole send, it gives you a short human-readable code — something like 7-crossover-clockwork — you tell your recipient the code over whatever channel you like, they run wormhole receive, done.
First time I used it I actually laughed out loud. It’s one of those tools that makes you think “why hasn’t everything always worked like this?” The code is short enough to read over the phone. The transfer is encrypted end-to-end. There’s no account, no key exchange ceremony, no configuration. It just works. Brian, if you ever read this: thank you.
For encrypting files at rest — backups, archives, things you’re keeping on disk — age [18] by Filippo Valsorda is what PGP file encryption would be if PGP didn’t suck. Simple CLI, auditable design, modern primitives (X25519, ChaCha20-Poly1305), implementations in Go and Rust. No keyservers. No key rings. No packet format from hell. Filippo’s written thoughtfully about the authentication properties of age [16] — it’s a system designed with clear boundaries about what it does and doesn’t promise, which is exactly the kind of honest engineering I want from my crypto tools.
For Package Signing: Signify/Minisign
OpenBSD uses Signify for package signing. It’s tiny, it uses Ed25519, and the public keys are short enough to paste mid-sentence in an email.
Minisign from Frank Denis [19] brings the same design to every other platform, with bindings in Go, Rust, Python, JavaScript, and .NET. Frank also created libsodium — arguably the most important modern cryptographic library in existence. If you’re building an application that needs to do cryptography, use libsodium. It has a deliberately hard-to-misuse API, it builds everywhere, and it means you never have to shell out to gpg as a subprocess. Which, if you’ve ever tried to parse GnuPG’s output in a script, you’ll know is a special circle of hell.
For Application Encryption: libsodium
I want to pull this out separately because it matters. If you’re a developer and you’re currently shelling out to gpg to encrypt data in your application — stop. Use libsodium [19]. Frank Denis designed it specifically so that non-cryptographers could use it safely. The API is opinionated in the right ways: it picks good defaults, it makes the dangerous options hard to reach, and it works on every platform you’ve ever heard of and several you haven’t.
Ptacek’s team at Latacora [1] recommends it. Soatok recommends it. I recommend it. There is genuinely no good reason to be using PGP as a library for application-level encryption in 2023.
The Steelman
I promised myself I’d be fair about this. I’ve spent a lot of words explaining what’s wrong with PGP, and some of you reading this are people who’ve used PGP for years, who care about privacy, who are trying to do the right thing. I don’t want to be dismissive of that. You’re not wrong to care about encryption. You’re just using the wrong tool.
So let me take the counter-arguments seriously.
“But PGP is battle-tested.” It is. And it’s lost most of those battles. CVEs going back decades. The keyserver poisoning attack in 2019. EFAIL in 2018. The MDC downgrade vulnerability. The O(n²) key parsing. The GnuPG CVE list [1] is extensive. Battle-tested doesn’t mean battle-winning.
“But the Web of Trust is decentralised and doesn’t need a trusted third party.” I hear you. Decentralisation is appealing for all the right reasons. But decentralisation isn’t a virtue if the system doesn’t actually function. In practice — as the HN discussion around GnuPG’s notarial use [17] makes clear — everyone either used a keyserver (centralised, broken), directly exchanged keys (which is TOFU, the thing Signal already does with better UX), or just… didn’t verify at all. The Web of Trust was a beautiful idea. It was never a practical reality outside a very small community.
“But I’ve invested years in building my key’s reputation.” I know. And I’m sorry. Sunk cost is genuinely painful. But your long-term key is as secure as the least secure moment across its entire lifetime. Every machine it’s touched. Every backup that exists. Every time you used it on a laptop that wasn’t fully patched. Valsorda makes this point in [3] and it haunts me a bit.
“But what about at-rest encryption for email archives?” Use age [18]. Or encrypted disk images. Your operating system has full-disk encryption built in — FileVault on Mac, LUKS on Linux, BitLocker on Windows. None of these are perfect, but they work, and they’re a damn sight better than wrestling with gpg --encrypt --armor --recipient and hoping you remembered the right key ID.
“But PGP is the only option for my use case.” Maybe. If you’re doing something very specific — like ACH file encryption for banking, which came up in a Hacker News thread [17] — then yes, you might be stuck with PGP for compliance reasons. That’s not a technical argument in favour of PGP; it’s an argument about institutional inertia, and it’s worth knowing the difference.
Where I End Up
I’m not saying PGP was never important. It was. In 1991, Phil Zimmermann built something that mattered — politically, technically, historically. The US government tried to prosecute him under arms export regulations for giving ordinary people access to strong cryptography. That fight was real and it was important and he was on the right side of it.
But it’s 2023 now. We have better tools for every single thing PGP does, built by people — Moxie, Trevor, Filippo, Brian, Frank, and others — who’ve spent decades learning from PGP’s mistakes. The kind thing to do, the responsible thing to do, isn’t to keep recommending PGP out of nostalgia or habit. It’s to point people towards the tools that actually deliver the security we promised them.
Put your Signal number on your website. Send files with Magic Wormhole. Encrypt archives with age. Use libsodium in your applications. Let PGP rest. It served its purpose. The world moved on.
And the next time someone asks me for my PGP key, I’m going to send them this link, and then go make a cup of tea and stare at the Mediterranean and try not to think about CFB mode.
References
[1] Latacora — The PGP Problem
[2] Matthew Green — What’s the Matter with PGP?
[3] Filippo Valsorda — I’m Giving Up on Long-Term PGP
[4] Filippo Valsorda — OpenPGP Is Broken
[5] Trevor Perrin & Moxie Marlinspike — The Double Ratchet Algorithm
[6] Trevor Perrin — The Noise Protocol Framework
[7] Whitten & Tygar — Why Johnny Can’t Encrypt (1999)
[8] Why Johnny Still, Still Can’t Encrypt (2015)
[9] Rob Locher — OpenPGP Encryption Using GPG
[10] Brian Warner — Magic Wormhole: Simple Secure File Transfer (PyCon 2016)
[11] Moxie Marlinspike — Reflections: The Ecosystem is Moving
[12] Matthew Green — Was the Efail Disclosure Horribly Screwed Up?
[13] EFF — Not So Pretty: What You Need to Know About E-Fail
[14] Trevor Perrin — The Noise Protocol Framework (Real World Crypto 2017)
[15] Magic Wormhole
[16] Filippo Valsorda — age and Authentication
[17] HN — GnuPG for Notarial Acts in Washington State
[18] age — Simple, Modern File Encryption
[19] Frank Denis — libsodium, Minisign