With the proliferation of mobile phone cameras, the growth of social media, and the explosion of electronic communications, one of the biggest civil rights questions of this century will be the right to privacy. On one side will be the police and the security establishment, who will argue that their own operations should remain completely private while they are able to view what everyone else is doing. On the other side will be the ACLU and other privacy advocates, who will argue that privacy should be the default for everyone, and that the government should have to go through extraordinary measures when it wants to violate that privacy.
This battle is playing out right now in court with Apple versus the FBI. If you remember, back in December 2015, two terrorists in San Bernardino, California shot and killed fourteen people and attempted to detonate a pipe bomb. The couple was killed in a police shootout, but their phones remained. Now the FBI is taking Apple to court in an attempt to force it to write an operating system that would let the FBI circumvent the security on those iPhones.
I think this is a bad idea and support Apple for standing its ground.
The small stuff
There are quite a few reasons why this is unreasonable. Let’s start with the less interesting, common-sense arguments.
First, it’s a waste of effort. In the decade from 2004 to 2013, 313 Americans died as a result of terrorism. On the other hand, 316,545 Americans died as a result of firearms while on U.S. soil. So, either the FBI, NSA, and CIA are already fantastic at preventing terrorist deaths, or the terrorism problem is wildly overblown. If these security organizations actually cared about saving American lives, they wouldn’t be trying to eliminate the right to privacy, they’d be going after the Second Amendment, the right to bear arms.
Second, it’s a bit ridiculous to require a company to work for free, creating an operating system specifically for the FBI and, in the process, weakening its own business. Operating systems are complex—even small changes are huge amounts of work. And if it becomes known that Apple’s OSes are less secure than other OSes—as they will be if Apple is deliberately required to weaken iOS—then people who care about security will avoid the iPhone.
The biggest issue
Those two issues should be sufficient, but the biggest issue to me is that once the protection of privacy is eliminated, it’s all but impossible to get back. This means we should be extremely careful about creating precedents that reduce privacy and about building generalized privacy-infringing tools. Once Pandora’s box is opened, it can’t be closed again.
Even if the government as a whole can be trusted, the individuals on the ground can and will pervert the original goals. For instance, look at civil forfeiture, where assets can be seized from people suspected of criminal activity without charging the owner with wrongdoing. The police are using these laws to steal money from innocent people. This clearly isn’t even close to the spirit of the original law, yet the police are eager to do what they can to get money, even if it’s unethical.
If the police would go so far over the line with civil forfeiture, why would we believe that, if we give these agencies the ability to spy on people, they won’t abuse it?
And that’s just assuming low-level corruption, as opposed to an organized, deliberate governmental attack. In general, it’s a bad idea to create the apparatus for a police state, even if we don’t live in one today, because things change. At one point, both Syria and Lebanon were wonderful places to live with nice weather, good jobs, friendly people, and modern economies. Now, not so much.
Thus, instead of assuming the government will always be benevolent, think about how the technology could be used by other governments. Take, for instance, the democratically elected National Socialists in Germany. What would Hitler have done with this technology? If the answer is “turn the country into a hellhole of brutality and persecution”, then you should be cautious about giving that technology to your own government.
The anti-privacy proponents might claim that everything will be fine because courts will watch over the technology to ensure it isn’t abused. But it’s already clear that hidden courts—courts where the public is forbidden to view the proceedings or judgments—are no more than a rubber stamp. Out of the 35,529 FISA applications (court requests for electronic surveillance), only 12 have ever been denied.
And even if we could trust the security complex with this technology, it’s extremely unlikely that the technology would remain there. It would almost certainly get out of their hands—either through corruption or hacking—simply because the rewards for getting such technology are so high.
Why is the FBI trying?
One of the ironies of this case is that the FBI has hacked phones before and has claimed to be capable of hacking iPhones. So, if that’s the case, then why would they bother with this court case?
My guess is that the information on the phone isn’t actually valuable for preventing future terrorist incidents. Instead, the case is extremely valuable for moving the line, establishing a precedent in court that could be used far beyond this case.
The FBI loves that this is a terrorist case. They love that people are scared of terrorist attacks, because they can leverage that fear in the PR battle. They can imply that anyone who opposes them is soft on terrorism, doesn’t care about the people who were killed, and is willing to leave America open for another terrorist attack. If they can get the public on their side, it will help get the courts on their side.
Then, once they win the case, they’ve got a precedent. And they’ll use that precedent in all sorts of other court cases, not just related to terrorism, but anything illegal. Their argument will be “we suspect that guy might be bad, so we need to spy on him, and the Apple case means you have to help us.”
And that argument will soon morph into “any device you sell must have a back door allowing our spy agencies to peer inside it.” In effect, this would mean any device would be easily hackable by anyone who cared to try.
You might think that the FBI would never go so far, completely destroying any notion of privacy on electronic devices. But I think it would. The security agencies are already requiring back doors into cloud information repositories such as Google and Amazon. Why would they stop there?
The bottom line
Thus, I think this is a very important court case for the right to privacy. If the FBI wins, I think there’s a reasonable chance that within a decade, individuals will have no expectation of privacy on anything electronic. Not their computers, not their phones, not their online storage. It’ll all be wide open for the government to see.
And if that happens, you better hope that the government—and everyone else in the world—remains benevolent for your entire lifetime.