Last week was a big week in cybersecurity news, with a potentially landmark bombshell. Apple CEO Tim Cook's open letter to the public, outlining why Apple will fight the FBI's court order to break into a customer's phone, brought the public into the debate over encryption and government oversight.
The headlines this Tuesday caught me off guard. "Bill Gates sides with FBI" and "Gates breaks ranks over FBI Apple request" were not things I expected to read. But as his comments were clarified throughout the day, I saw a sentiment I can agree with 100 percent: this is a discussion we need to have.
If you read Tim Cook's letter, he's not unsympathetic to the FBI's cause. If the data is readily available and there's a clear case of criminal wrongdoing, or in this case, terrorism, then Apple is more than willing to help. The problem here is that Apple has designed a very good security system. By intention, even Apple does not have access to its users' data. Apple wants your data to be private, and that is simply good security.
What the government is asking Apple to do is create a special version of its iOS firmware that purposely bypasses some of the baked-in security features. For instance, the device can be set to wipe itself if a passcode is entered incorrectly more than 10 times. There is also a throttling feature: enter the wrong passcode more than four times and you may have to wait before you can try again. These features are designed to counter brute-force attacks, in which a criminal who steals your phone builds a rig that tries thousands of passcodes per hour until it eventually unlocks all of your data. (In this case, the FBI is also asking Apple to write additional code that allows passcodes to be entered electronically, which would make cracking much faster.) So what the government is really asking for is a way to make this phone easier to brute force.
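To see why throttling matters so much, here's a back-of-the-envelope sketch in Python. The attempt rates are illustrative assumptions for the sake of the arithmetic, not Apple's actual numbers:

```python
# Rough worst-case estimate of brute-force time for a numeric passcode,
# with and without rate limiting. Rates below are assumptions, not
# measured iPhone figures.

def hours_to_try_all(num_digits: int, attempts_per_hour: float) -> float:
    """Worst-case hours to exhaust every possible numeric passcode."""
    combinations = 10 ** num_digits  # e.g. 10,000 for a 4-digit code
    return combinations / attempts_per_hour

# Unthrottled electronic entry (assumed rate): thousands of tries per hour.
fast = hours_to_try_all(4, attempts_per_hour=5000)

# Throttled manual entry (assumed: escalating delays average one try per minute).
slow = hours_to_try_all(4, attempts_per_hour=60)

print(f"4-digit passcode, unthrottled: {fast:.1f} hours")   # 2.0 hours
print(f"4-digit passcode, throttled:   {slow:.1f} hours")   # ~166.7 hours
```

Under these assumed rates, removing the throttle (and the wipe-after-10 limit) turns a week-long job into an afternoon, which is exactly why the FBI wants those features bypassed.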
As you may have seen, the government is leveraging an old law, called the All Writs Act of 1789 to compel Apple to comply. I am not a legal scholar, but in my understanding, the All Writs Act is sort of a catch-all act that was designed to allow courts to legally order an organization to do something when there really is no law around the particular issue.
So what’s the problem here?
I think we can all agree that any reasonable human being wants to help the authorities, especially in a clear case like this where there's a known terrorist, and they are just trying to protect the public from more attacks. In fact, this dead terrorist no longer has any real right to his privacy. As I mentioned, Cook acknowledges that in his letter. But in this case, the FBI is asking Apple to create a weakened version of its product to help the FBI crack the iPhone. Of course, the U.S. government says that it only intends to use this weakened firmware for good. Maybe even in just this one case.
But Apple’s argument, which is one I agree with, is that once you release weakened firmware into the world, you have no control over how it’s being used, or whether or not other authorities will continue to make similar versions for other phones in other cases that are less clear-cut. Even if the intentions are good in this case, if this falls into the wrong hands, you’re weakening everyone’s security and privacy. While a dead terrorist has no right to privacy, we do.
This is becoming a repeated topic of concern between governments and security vendors. This time, it’s trying to brute-force the passcode, but the argument also applies to governments hoping to maintain some “master key” for public encryption solutions.
The ends don’t always justify the means
Personally, I side with Apple. No matter the means, purposely weakening consumer security will do more harm in the long run than any gain made for the investigation in the short term. Furthermore, there's not only little evidence that having backdoors would help the government stop criminals or catch terrorists; there's evidence to the contrary.
Bruce Schneier is a well-known cryptographer who took part in two recent studies. The first looked at whether it would really help if the government had backdoors in domestic encryption products. It found that there are hundreds of non-domestic encryption products that terrorists can turn to, and still protect their communications.
The second study looked at criminals’ communications “going dark.” What they found is that plenty of communication still happens in clear text. There’s plenty of metadata and other data that can still help investigators track down this kind of digital crime.
So really, it shouldn’t be necessary for governments to break consumer encryption and consumer security just to catch criminals. There are plenty of other investigation options that are still open, and breaking encryption just weakens the entire state of the internet.
In any case, it's a very interesting security conundrum, and something of a gray area. I agree with Apple, and I think Cook lays out a very compelling argument. But what is really important here is that our entire community, all of the citizens of the world, have a transparent and public discussion about this. We're going to see more and more situations where governments try to push the envelope when it comes to computer security and defense.
Besides trying to get their hooks into security, we've also seen governments go on the offensive. If you've followed information security for the past five or so years, you've seen plenty of evidence that governments are creating "red teams" trained to launch computer and network attacks. You've seen the details about Operation Olympic Games, experts have analyzed Stuxnet, you've followed the Snowden leaks, you've seen government cyber budgets expand, and most recently, you've probably heard Ukraine accuse another country of attacking its critical infrastructure.
I believe that if governments really want to keep their citizens safe, they need to focus more on defense than offense. Offensive cybersecurity tactics offer only short-term benefits but have long-term consequences. A government focused on offense is motivated to hide vulnerabilities for later exploitation. This puts every citizen at risk, as bad actors will surely find these holes too.
Are the governments considering launching such attacks really prepared to defend themselves from these same attacks? The short answer is no. Even the former director of the CIA and NSA says that we’re not prepared. In fact, with these calls to create “backdoors” and encryption master keys, they’re actively tearing down our defenses, thus making everyone’s problem worse.
In my opinion, the ends don’t always justify the means. If the means include citizens of a free democracy sacrificing privacy, freedom, and security all for the sake of some vague idea of safety that governments can never really deliver on, I say to heck with those means.
Rather, if governments are really serious about our digital security, they need to get serious about information security. They should spend their time making Apple, and all other public and private vendors’ security features stronger; they should create unbreakable encryption that protects all citizens’ communications; and they should find and plug every zero-day vulnerability they can, so no terrorist or nation state can leverage it to gain asymmetric power over others. Stuxnet opened the Pandora’s box of the cyber arms race. If we want to close that box, we should focus less on the arms and more on building better armor.
That's my reasoning. What do you think?