WhatsApp has earned a reputation as one of the most securely encrypted messaging services, allowing conversations to remain private and away from the watchful eyes of hackers, including those employed by governments.
Of course, encryption is a tool that, like any other, can be used for evil as well as good, and following an attack by an ISIS-inspired terrorist in London, government authorities are crying foul, demanding that the developer of WhatsApp modify its system to allow access to law enforcement.
This is essentially the same argument that the U.S. government had with Apple in the wake of the 2015 San Bernardino shootings — when the FBI demanded that the company write software to bypass the security protections on one of its iPhones, and Apple refused. In both cases, the request sounds reasonable on the surface. Why not just let the government see what known terrorists are up to? Isn’t a refusal tantamount to harboring criminals, willfully protecting their illegal activities from detection?
The answer is, “No, it isn’t.” The reason the request sounds reasonable is that most people don’t understand how encryption really works, and why it’s not a simple matter to grant access to certain individuals without inadvertently opening the door to everyone else.
The best encryption uses what’s called the end-to-end method, where messages are visible only to the sender and the intended recipient. In this case, the developer never holds the keys needed to decrypt the messages, and couldn’t read them even if it wanted to. It’s the safest way to communicate privately, and the reputation of the developer depends on it remaining safe. This is the kind of encryption WhatsApp uses.
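The core idea — that two endpoints can agree on a secret key while the service relaying their messages learns nothing — can be illustrated with a toy Diffie-Hellman key exchange. This is a simplified sketch, not the actual protocol WhatsApp uses (that is the Signal protocol, which builds on the same principle with far larger parameters and authenticated exchanges); the small prime and variable names here are illustrative choices only.

```python
import secrets

# Publicly known parameters (toy values for illustration --
# real systems use primes thousands of bits long).
P = 0xFFFFFFFB  # a public prime modulus
G = 5           # a public generator

def keypair():
    """Generate a private exponent and the public value sent over the wire."""
    private = secrets.randbelow(P - 2) + 1
    public = pow(G, private, P)
    return private, public

# Alice and Bob each generate a keypair; only the public halves
# ever pass through the messaging server.
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

# Each side combines its own private key with the other's public value.
alice_secret = pow(b_pub, a_priv, P)
bob_secret = pow(a_pub, b_priv, P)

# Both endpoints now hold the same key and can encrypt messages with it...
assert alice_secret == bob_secret
# ...but the server, seeing only G, P, a_pub, and b_pub, cannot compute
# the key without solving the discrete-logarithm problem.
```

The point of the sketch is the asymmetry: the relaying server sees everything that crosses the wire, yet the shared secret is derivable only by someone holding one of the private exponents — which never leave the endpoints.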
The only way, then, for WhatsApp to comply with the demands of the British government would be to weaken its encryption process, putting in a backdoor that deliberately makes messages insecure. But if a backdoor exists, it will not only be available for one government to exploit; it will be available for all governments, as well as independent hackers, to exploit. A message is either secure or insecure — it can’t be both. What the British government is asking is for a company to abandon the whole point of its business model, and instead sell software vulnerable to spying.
U.K. Home Secretary Amber Rudd objects to secure encryption, and had this to say in a statement: “We need to make sure that organizations like WhatsApp — and there are plenty of others like that — don’t provide a secret place for terrorists to communicate with each other.”
It’s impossible to say whether she has thought through the implications of that statement, but the rest of us certainly should. In essence, what she is saying is that there should be no privacy whatsoever — that the government should have access to literally every conversation that goes on in the world, and that no citizen should be safe from the all-seeing eyes of the state. What other conclusion can be drawn from pronouncements like this?
It’s also worth noting that secure encryption is an absolute necessity for e-commerce. The only way people will send bank account and credit card information over the net is if they can be confident that hackers — governmental or otherwise — don’t have access to that data. In a world without encryption (the world Ms. Rudd apparently wants), the enormously productive online economy would be impossible.
At the root of this conflict between law enforcement and tech companies is a deep-seated technological illiteracy. Government officials simply don’t understand that you cannot have your cake and eat it, too — that you can’t have secure communications “except for the bad guys.” If they did, they would understand that what they are asking for is not only a violation of individual rights to privacy, but fundamentally destructive to the entire future of the information age.
This article originally appeared on Conservative Review.