How to Build Truly Private Email in Ten Easy Steps

This guide aims to demonstrate how easy it is to set up truly private email in ten steps.

First, some assumptions:

  • Cryptography must always be available, in at least some form, in order for any of this to work. If all forms of cryptography fail us, then no one will be able to write anything private, digitally or otherwise, ever again.

  • The aim of this document is to protect against automated digital mass surveillance, not against physical intrusions with access to both the hardware and the user. Our best defense against the rubber hose attack only works when the physical authentication point itself is secure.

  • Whether or not this process is (and continues to be) legal is not relevant to this argument. See previous point.

Now on to the methods:

  1. The email client must be native software, not a web page downloaded over the internet. As we have seen with Lavabit[1], a web service could be compelled to add an exploit to its web client, thereby rendering any security useless. This is true no matter where the server is located. Germany is no more secure than the USA.

  2. The email client must be open source software, to allow it to be verified independently by third parties. Each proposed patch must be made publicly available for analysis by the community before acceptance into the source code tree. At each release, any developers who contributed patches to the release must sign the release with their private keys to certify that the source code contains no security flaws. The list of signatures and public keys will be made available online.

  3. The email client must be accompanied by a formal proof of correctness written in a system such as Coq. The proof verification system itself must also be open source and available for verification by third parties. (A toy example of what a machine-checked proof looks like appears after this list.)

  4. The email client must be compiled with a provably correct compiler such as CompCert. The compiler must also be open source so that its correctness can be verified by third parties. The compiler must be able to compile its own source into a binary which is bit-for-bit identical to the binary doing the compiling, in order to prove that the original binary is valid. The proof system must also be compiled under the certified compiler in order to prove that it, too, is valid. (A sketch of the bit-for-bit check appears after this list.)

  5. The entire software stack, including the OS kernel, device drivers, and any other software running on the same system with sufficient privileges to look into the email client process’s memory, must be open source, so as to be verifiable by third parties, and must also be accompanied by a formal proof of correctness.

  6. The entire hardware stack, including the CPU, RAM, storage devices, and any other hardware with access to the system bus where data is exchanged between the CPU and other subsystems, must be open source, so as to be verifiable by third parties, and must also be accompanied by a formal proof of correctness. The user is expected to own a scanning electron microscope so that they can verify that the circuit diagrams match up with the circuits printed on the actual hardware. And of course the scanning electron microscope software and hardware must also be open source, with an accompanying formal proof to verify correctness.

  7. The email client generates a key pair for the user. The private key is never sent to the email server. The public key must be distributed to any users who wish to send email to the user, for example via a QR code on a business card. All email from Alice to Bob must be signed with Alice’s private key, then encrypted with Bob’s public key, so that it can only be verified with Alice’s public key and decrypted with Bob’s private key. (A sketch of this scheme appears after this list.)

  8. The user’s hard drive must be encrypted with a randomly generated password of at least 64 bytes in length. The user is expected to memorize this password and never write it down or reuse it for any other machine or service, ever. (A sketch of generating such a password appears after this list.)

  9. The user’s machine must boot from a bootloader whose signature is verified against a public key physically stamped into the hardware in a way that cannot be altered by the user or anyone else. Each stage of the boot process must validate the next stage to verify that no component in the boot chain has been modified. As a result, the hardware must be perfect by design, because it cannot be updated at a later time. Preferably, the hardware TPM should be tamper-proof, or failing that, tamper-resistant with a seal that will erase all data on the machine if broken. (A sketch of such a boot chain appears after this list.)

  10. Use any email service. IMAP, POP, and Exchange all work. Don’t worry about encrypting the connection to the server; the email is already encrypted. Done.
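
For step 3, here is the flavor of a machine-checked proof. This is a toy Lean snippet rather than Coq, and it proves nothing about any actual email client; the point is only that the checker accepts the file when, and only when, every claim is fully justified.

      -- Two toy theorems. Delete either proof term and the checker rejects the file.
      theorem zero_add_example (n : Nat) : 0 + n = n :=
        Nat.zero_add n

      theorem and_swap_example (p q : Prop) : p ∧ q → q ∧ p :=
        fun ⟨hp, hq⟩ => ⟨hq, hp⟩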
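
For step 4, bit-for-bit identical means exactly what it says: the bytes of the rebuilt compiler must match the bytes of the compiler that built it. A minimal Python sketch of that check, with the file names invented for illustration:

      import hashlib

      def sha256_of(path: str) -> str:
          """Hash a file in chunks so large binaries need not fit in memory."""
          digest = hashlib.sha256()
          with open(path, "rb") as f:
              for chunk in iter(lambda: f.read(1 << 20), b""):
                  digest.update(chunk)
          return digest.hexdigest()

      # Hypothetical artifacts: stage1 compiled stage2 from the same source.
      if sha256_of("compiler-stage1.bin") == sha256_of("compiler-stage2.bin"):
          print("reproducible: the compiler rebuilt itself bit for bit")
      else:
          print("mismatch: the two builds differ")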
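
For step 7, a sketch of the scheme using PyNaCl, which is my choice of library, not a requirement; any authenticated public-key scheme would do. Strictly, a Box derives a shared key from Alice’s private key and Bob’s public key rather than nesting two encryptions, but the effect the step is after is the same: only Bob can read the message, and he knows it came from Alice.

      from nacl.public import PrivateKey, Box

      # Each side generates a key pair; private keys never leave the machine.
      alice_private = PrivateKey.generate()
      bob_private = PrivateKey.generate()
      alice_public = alice_private.public_key   # handed out via QR code, business card, etc.
      bob_public = bob_private.public_key

      # Alice encrypts to Bob using her private key and his public key.
      sealed = Box(alice_private, bob_public).encrypt(b"Meet at noon.")

      # Bob decrypts with his private key and her public key, which also authenticates her.
      assert Box(bob_private, alice_public).decrypt(sealed) == b"Meet at noon."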
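
For step 8, generating such a password is the easy half; Python’s secrets module will happily oblige. Memorizing the result is left as an exercise for the user.

      import secrets
      import string

      # 64 characters drawn uniformly from letters, digits, and punctuation.
      alphabet = string.ascii_letters + string.digits + string.punctuation
      password = "".join(secrets.choice(alphabet) for _ in range(64))
      print(password)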
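
For step 9, the shape of a verified boot chain, sketched with PyNaCl signatures. Real hardware stamps only the verify key into silicon and typically uses a different key per stage; this toy signs every stage with a single root key, and every name in it is invented for illustration.

      from nacl.signing import SigningKey
      from nacl.exceptions import BadSignatureError

      # In real hardware only the verify key is baked in; here we generate the pair locally.
      root_key = SigningKey.generate()
      signed_stages = [root_key.sign(image) for image in
                       (b"bootloader image", b"kernel image", b"init image")]

      def boot(verify_key, stages):
          """Refuse to hand control to any stage whose signature does not check out."""
          for signed in stages:
              try:
                  image = verify_key.verify(signed)
              except BadSignatureError:
                  raise SystemExit("boot halted: a stage has been modified")
              print("running stage:", image.decode())

      boot(root_key.verify_key, signed_stages)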

Sounds easy, right?

P.S. Oh, you wanted anonymity? Sorry, that’s not possible. If all the links are compromised, Tor doesn’t work too well.

P.P.S. Also, make sure the cryptosystem is flawless by design. After all, storing the messages indefinitely isn’t hard, and waiting until the cryptosystem breaks is an entirely feasible method of defeating the encryption.

P.P.P.S. And while you’re at it, design an encoding that maps the encrypted bytes onto random words from the English dictionary, and use Google’s n-gram distributions to make the result indistinguishable from non-encrypted content. A few random images and charts would help to round out the content.
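
A toy sketch of the first half of that idea, mapping ciphertext bytes onto dictionary words two bits at a time. The four-word list is invented for illustration, and no n-gram model on earth would mistake the output for prose.

      import secrets

      # A 256-word list would map one word per byte; this toy list covers two bits at a time.
      WORDS = ["apple", "river", "cloud", "stone"]

      def encode(data: bytes) -> str:
          """Re-encode bytes as English words so the ciphertext looks like text."""
          out = []
          for byte in data:
              for shift in (6, 4, 2, 0):
                  out.append(WORDS[(byte >> shift) & 0b11])
          return " ".join(out)

      def decode(text: str) -> bytes:
          values = [WORDS.index(word) for word in text.split()]
          return bytes(
              (values[i] << 6) | (values[i + 1] << 4) | (values[i + 2] << 2) | values[i + 3]
              for i in range(0, len(values), 4)
          )

      ciphertext = secrets.token_bytes(8)   # stand-in for the already encrypted email
      assert decode(encode(ciphertext)) == ciphertext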

(P.)4S. Technically, the email client can be a web app, but you need a local signature for the app, which of course must be open source and include a formal proof of correctness. And the browser used for running the web app must also be open source and include a formal proof. In addition, the browser must perform information-flow analysis to prove that any JavaScript won’t leak your private data to the internet. Preferably, JavaScript would be replaced with a language whose static type system guarantees such properties, such as Haskell.

(P.)5S. This document reflects no beliefs, not even mine, and most certainly not the beliefs of any past, present, or future employer.

(P.)6S. Yes, this is a joke.


  [1] This assumption is derived from public statements made by Lavabit, and has in no way been verified as the actual truth. That said, the point about web services being inherently insecure still stands irrespective of what actually happened to Lavabit.