The US Gov’t Says Backdoors Are Great For You — But A Serious Security Risk For Them

Nick Selby
6 min read · Dec 19, 2015


Sometimes, when I’m on a plane, while we’re still on the ground, I’ll approach someone in First Class and, summoning all the earnestness at my command, I’ll say, “Hey, how you doing? Listen, you’re using the same text notification sound as I am.” Then I pause. Then I follow up, “So, why don’t you go ahead and change that for me, OK?*”

Like all good jokes, this one needs an explanation. See, it’s funny because the request is so outrageous: my apparent expectation that it’s reasonable to ask a stranger to change his personal preferences, and that my victim would be polite enough to go ahead and do it, is what makes it funny. Usually, they stare at me in total confusion for a second, then laugh once they realize that I could not possibly be serious.

I tell you this because I’ve been looking at the government proposals from a little while ago, and at the FBI’s desperate pleas to insert (through “cooperation,” not legislation) back doors into encryption, and I kind of thought they were kidding at first, in the same way I am on the plane.

“Hey, how you doing? Listen, I see you just bought a house.” Pause. “So why don’t you go ahead and make an extra key for me, OK? Case I ever need to get in and toss the place.”

Note the penultimate line in the document.

It’s even funnier when the FBI starts investigating a back door into an encrypted communications method that it uses, and when government officials say things like the back door is akin to, “stealing a master key to get into any government building.”

Say, that’s a good analogy.

From the beginning, security experts and technologists (myself among them) have been saying very clearly that there is no technical way to build a back door for one party that cannot also be used by any other party. It’s not fun to be right in this case.

During “an internal code review,” Juniper discovered a vulnerability (actually, two separate vulnerabilities) in its operating system, including in the part that creates what are referred to as “encrypted tunnels” or “virtual private networks.” Juniper says the unauthorized code has been there for some time, as in, a couple of years.

This back door in the Juniper code provided not just the ability to see otherwise encrypted traffic, but also the ability to cover one’s tracks. There are actually two separate issues with the Juniper kit (and don’t think this isn’t the case with products from other vendors, or other kinds of products, because basically everything we have is, as my friend Aaron used to say, “Certified Pre-Owned”), but it’s important to note that, according to Juniper, “there is no way to detect” whether such a knowledgeable attacker (the US government, a criminal group, or a foreign-funded nation-state attack group) has accessed the VPN and viewed the encrypted data.
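The structural flaw is easy to sketch. A back door is, at bottom, a second credential path, and the check that honors it has no way to tell the “authorized” party from anyone else who learns the secret. A minimal, hypothetical illustration (the names and secret below are invented for the sketch, not Juniper’s actual code):

```python
# Hypothetical sketch: a back door is a second way in, and the check
# cannot distinguish the "intended" holder of the secret from anyone
# else who discovers it.
BACKDOOR_SECRET = "master-key-for-the-good-guys"  # hardcoded, as backdoors are


def authenticate(user: str, credential: str, user_db: dict) -> bool:
    """Return True if the credential grants access for this user."""
    if credential == BACKDOOR_SECRET:
        return True  # grants access to ANY party presenting the secret
    return user_db.get(user) == credential


users = {"alice": "correct-horse-battery-staple"}

print(authenticate("alice", "correct-horse-battery-staple", users))  # True: normal login
print(authenticate("attacker", BACKDOOR_SECRET, users))              # True: the back door
```

That second `True` is the whole problem: once the secret leaks, whether through code review, reverse engineering, or an insider, every “protected” system is open, and there is no principled way to restrict that branch to one government.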

The FBI has not stated whether it is investigating, but the deliciousness of the FBI doggedly tracking these hackers along the digital cyber stream until it empties into an ocean full of irony is not lost on me.

These are the very security holes, impossible to fill, that many of us referred to. It is crucial to note that when FBI Director James Comey tells Congress that the government must “continue to ensure that citizens’ legitimate privacy interests can be effectively secured, including through robust technology and legal protections,” it is simply not technically possible to do that in a world in which back doors are provided to the government.

This is not an opinion. It is fact.

Because of that, beyond the civil liberties concerns, back doors are a terrible national security problem, and they must be treated as one.

There are so many reasons why allowing encryption back doors is a stupid, blundering, lazy, liberty-sucking, un-American thing to do that I will only talk about one: we don’t need it.

The FBI and other national intelligence agencies arguably do a pretty good job detecting and interrupting terror plots of the 9–11 model, and do an OK to middling job detecting and preventing smaller-scale attacks. Along the way, they’ve also, ah, over-reached quite a few times.

When the FBI Director says, “The harms resulting from the inability of companies to comply with court-ordered surveillance warrants are not abstract, and have very real consequences in different types of criminal and national security investigations,” it sounds like (and is often described by the media as) a “Law Enforcement” request.

But the overwhelming majority, the nearly total volume, of the work done by law enforcement is unaffected by this. Sure, when police investigate organized retail crime, human trafficking or child sexual exploitation, the job would be substantially easier if we were able to call up Apple and say, “Hey! Y’all want to go ahead and un-encrypt all of Henry’s iPhone and stored files for me? Thanks.”

Easier for cops doesn’t mean it’s a good idea. Easier for cops doesn’t make it a reasonable request. In this country, cops must work for a living (and get yelled at for doing it, I might add). Make no mistake, what the government is discussing here is the removal for 318,000,000 people of the ability to securely protect their communications, for the ease of law enforcement seeking data on a few who wish us true harm.

The harm is real. That there are many who wish us death is real. Yet every day, intelligence analysts and law enforcement officers at the local, county, state and federal level in this country build cases against, and thwart, terrorists. How?

Police work.

Even if law enforcement were specifically investigating 100,000 truly bad people a year, this is a request to dilute the security and integrity of the communications of 318,000,000 Americans to benefit investigations of 0.03% of the country.
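The arithmetic behind that percentage, as a quick sanity check (the 100,000 figure is this article’s own hypothetical, not a government statistic):

```python
# Back-of-the-envelope: hypothetical investigations as a share of population
investigated = 100_000       # generous hypothetical number of targets per year
population = 318_000_000     # approximate US population

share = investigated / population
print(f"{share:.2%}")  # prints 0.03%
```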

And another thing: Users screw it up even when it’s easy.

Users don’t understand encryption, what it protects or how it is used. The national intelligence agencies count on this.

Why? Because it’s hard. Glenn “Scoop” Greenwald loves to act like he’s some kind of expert, but it is clear to any viewer of Citizenfour that, before Ed Snowden showed up and showed him repeatedly how it worked, Greenwald had never heard of, tried, or used any encryption except when he bought stuff on websites. That’s why others write the good encryption how-to guides at the world’s most cynical, profit-seeking, hypocritically civil-liberties-themed site, The Intercept.

When even moralizing tosh-buckets with strong, vested financial interests in getting encryption right get it wrong, the “grave” urgency with which the government seeks back doors is, at best, over-stated.

Politics, of course, make for strange bedfellows, and on this one, I’m with Benedict Greenwald. As an EFF supporter and ACLU member and a police investigator and a technologist, I believe truly that the juice of encryption backdoors isn’t worth the squeeze it places on our civil liberties and our right to communicate, associate, conduct commerce and above all, to express ideas in an environment safe from government oversight.

What we need to solve this is real cyber-security legislation, informed by actual computer- and information-security professionals, those in the offensive and defensive security business, and not by a bunch of “Me-too”-voting people who have personal assistants check their email for them.

Computer security is, you know, hard.

____________________

* Yes, I really do this.

Written by Nick Selby

Fintech Chief Security Officer. Former NYPD apparatchik. Co-author Cyber Attack Survival Manual; In Context: Understanding Police Killings of Unarmed Civilians.
