We decided to do something sort of (but not especially) experimental with the blog–a conversation post on some topics we can agree to discuss. Our first topic is the case of Apple v. FBI.
I will say upfront that none of us is an engineer or a trained lawyer (and you can quickly see why those professions have their value). However, one of us is a public policy generalist; another works in technology in the Bay Area; yet another is an aspiring lawyer; and I have something of a technical background. Without further ado:
R.S.: So the FBI, for all its power, cannot make its way into the data of the iPhone owned by one of the San Bernardino shooters, and Apple refuses to create a “backdoor” around the device’s password protection software on the grounds of privacy. This has become a debate of security versus privacy, though some argue it isn’t that at all. The media has covered this head to toe, so I was a bit wary of taking on this topic, but I have faith in us to say something new.
I personally am a bit torn. I do see Apple’s point that creating a backdoor is dangerous and would compromise the privacy not just of terrorists and would-be terrorists, but of every iPhone user (not me, a forever Android user whose phone is easily hackable); at the same time, it’s hard to tell the FBI that it can’t have what could be important data on a device that may have been crucial to the planning of the attack. After all, the attackers in Paris communicated via WhatsApp, so technology and social media are becoming more and more central to counterterrorism efforts. Apple wants to prevent a slippery slope of government requests for individuals’ private data (and a degradation of its brand), while the government, though Director Comey claims this is a one-time thing, needs to be able to work with the secrets of this technology in the long term.
Maybe I can be swayed one way or the other. What do you guys think?
N.B.: Apple, as a profit-seeker, is doing what it can to protect its business model. I mean…they’ve been able to sell six and a half versions of roughly the same product (says this owner of the iPhone 5). Why have they been able to do this? They are the epitome of simplicity in tech. On top of that, and more central to this topic, people buy Apple products because they feel more secure from hackers. For your average person (yes Android lovers, I am identifying you as not a part of the norm), simplicity and security are paramount as we move towards further integration of the Internet of Things. As a side note, it would be interesting to see how Apple’s business pitch would change if they are forced to open a backdoor. They would still hold the simplicity vote over Androids, but would no longer hold the “good faith” security vote.
Allowing the FBI through a backdoor creates a precedent for bypassing Apple’s strong security, one that opens the future to the universal enabling of backdoors to all Apple devices. Michael Hayden, former head of the NSA and CIA, advocates for end-to-end encryption. He thinks that in this era of cyber-connectivity, Americans are safer with that limitation. He readily admits that were such a backdoor available, and were he still head of the NSA, he would absolutely make use of it whenever he saw fit.
From Hayden’s perspective, it is important to truly define and lay out what is being asked of Apple by the FBI. I recently listened to him in an interview with Fareed Zakaria in which he laid that out. Simply put, he feels that the FBI doesn’t entirely know what it is asking for and needs to better define it.
J.S.: In the past few days the government filed a motion to compel and Apple filed a motion to vacate that order. After glancing through the arguments laid out in the FBI’s motion to compel and Apple’s subsequent motion, I have noticed a few things. The only really important question we’re all trying to answer is whether or not the government has the authority to order Apple to unlock the phone based on existing law or precedent.
Apple’s brief was colorfully written and a powerful pro-security document. I mean, just read the first two sentences.
This is not a case about one isolated iPhone. Rather, this case is about the Department of Justice and the FBI seeking through the courts a dangerous power that Congress and the American people have withheld: the ability to force companies like Apple to undermine the basic security and privacy interests of hundreds of millions of individuals around the globe.
This definitely was written for the public to see. Regardless, the argument Apple makes throughout the brief is simple: if the government can force Apple to unlock this particular iPhone, it opens up potential (or probable) privacy violations for other Americans. This is a great policy argument. Unfortunately, it’s not really a legal argument on the authority the government has in this case.
Apple makes a few specific precedent-based arguments, which are a bit complex and would take me a long time to sift through. They rely on a close reading of a wiretapping statute from 1994, an interpretation of a 1977 case construing the All Writs Act of 1789, and First and Fifth Amendment theories of law.
I’m just going to talk briefly about the second argument, since it was mentioned in Tim Cook’s letter to his customers and the FBI essentially relies on the All Writs Act in its motion. This act essentially gives courts the authority to issue any legal order they need to do their jobs. Yes. I agree. It’s EXTREMELY broad and all-encompassing. The Apple brief goes through the three-part test used in the 1977 case United States v. New York Telephone Co. to enumerate the reasons it shouldn’t have to comply. This is probably Apple’s strongest argument against the government. However, I see a magistrate judge defaulting to the authority of the Act itself rather than relying on Apple’s interpretation of it. But who knows, maybe I’m not giving Tim Cook’s lawyers enough credit.
My favorite argument in the brief has to be their First Amendment argument. Code is speech and speech is free. I really hope the judge rules in favor of Apple on this argument.
After reading through the brief and sifting through the strength of the arguments I think that Apple will lose their motion to vacate. They make some incredibly powerful policy arguments but the jurisprudence and precedent is on the government’s side. The motion itself shows how dedicated Apple is to protecting their encryption code. I see Apple taking this case all the way up to the Supreme Court, the venue where these policy battles are finally decided.
Unfortunately, the law has not caught up to technological advances such as encryption codes, backdoors, and cloud backups. This gap means the judicial branch must interpret 21st century cases with 18th century laws. This forces both sides to twist somewhat irrelevant interpretations of cases to match their argument, which leads to an extremely confusing gray area. Until the law can catch up to technology we’re going to be seeing a clash like the Apple/FBI battle far more often.
J.M.: The big debate: Apple vs. FBI. Privacy vs. Big Brother. I understand that if the American government succeeded in pressuring Apple into helping with “hacking” the phone, that would give the government the right to pressure any other tech company to do the same. Google would have to submit to hacking into Gmail accounts, Oracle would have to break into its own servers to release client info, and Dropbox would have to surrender its users’ backed-up data. It is the key to a Pandora’s box.
However, where do we draw the line between protecting an individual’s privacy and protecting the lives of the masses? Hacking the iPhone could potentially uncover necessary and useful info for the government, help take down ISIS, and even prevent further domestic terrorist attacks. Keep in mind that the San Bernardino shooting was an example of exactly the domestic terrorism we’re trying to prevent. An attack by homegrown terrorists is not just a possibility, but a reality.
R.S.: I think I have a problem with the argument that it will be for this “one time.” There’s a scene in The Dark Knight this reminds me of–croaky Batman unveils an enormous surveillance computer that listens to every cell phone in Gotham for the purpose of locating the Joker. Morgan Freeman’s character protests that it’s too much power with no oversight, but when the job is done, Batman allows the computer to self-destruct. I’m not sure anyone on Earth is so responsible, nor does the film address the fact that once he’s built it, there’s nothing stopping him from building it again.
But both Jasmine and Jia have mentioned something important about dealing with this issue in the long term; one aspect is legal–that the judicial branch must interpret 18th century law for the 21st century–and another is technological, in that terrorist groups are using this technology to further their aims (e.g. propaganda by means of social media) as well as to conduct operations. The FBI and other agencies conducting counterterrorism do need a solution to this problem, because gone are the days when human intelligence alone or even looking through obscure websites and monitoring web communications would be enough to discover credible threats.
The problem is that any one of us can pose a threat, and any one of us can possess an iPhone, so the government may need some way to access the information on that device in the event that an individual is doing harm.
The simultaneous problem is that any one of us could be treated as a threat. This goes back to the slippery-slope argument: if the government can start snooping on a suspect’s phone, it is only so many steps away from snooping on an innocent individual’s phone.
Furthermore, Nick raises a good point about a long-run issue, the further integration of the “Internet of Things.” As technology becomes more integrated–as computers begin to determine not just whether your text was sent but also whether your fridge is running at the right temperature or your car is going at the most eco-friendly speed–it is not good practice to start building in security backdoors. If the FBI can get into them, certainly other, less savory groups will try to make their way in as well.
Moreover, as Jasmine said, technology is evolving faster than the law (which is not new–I believe the railroads were laid and the telephone lines were up before courts could rule on the attendant issues of eminent domain, privacy, etc.); the argument that code is speech is interesting. Having done some programming myself, I agree that a line of code expresses an idea, just as a collection of such lines–i.e., a program–expresses a bigger idea, performs a function, and potentially becomes intellectual property. Where this breaks down, I think, is where said speech becomes a threat to public safety.
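To make that concrete with a toy example of my own (the names and scenario here are entirely made up for illustration, nothing from the case), even a one-line function reads like a declarative sentence, and a few of them together start to express a larger idea:

```python
# A made-up illustration: each line "says" something on its own.
def is_curfew_violation(hour):
    """One sentence of code: being out between midnight and 5 a.m. breaks curfew."""
    return 0 <= hour < 5

def flag_violations(hours_seen):
    """A few sentences together form a bigger idea: scan observations, report hits."""
    return [h for h in hours_seen if is_curfew_violation(h)]

print(flag_violations([23, 2, 4, 13]))  # prints [2, 4]
```

The point isn’t the particular function; it’s that the lines communicate intent the way prose does, which is the heart of the code-as-speech claim.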
So, play Supreme Court for me. Should the Bureau leave this one alone? Or should Apple submit for the good of national security?
J.M.: Let’s play out a hypothetical. Say a young woman happens to be at the corner of 16th & Broadway, in the parking lot of the Goodwill in San Diego, on February 6th. Now suppose a pro-rape group assaults her, and these rapists happen to use password-protected iPhones to coordinate their sick event. What if the proper authorities managed to get their hands on one of the devices, but couldn’t “hack” into it? Under the principle Tim Cook and the courts have endorsed, they cannot break into these phones, because these rapists have the right to privacy. On principle, the authorities would be unable to apprehend these criminals or possibly prevent other rapes from happening. On February 6th we came close to something like this actually happening: a pro-rape group was about to hold a global event promoting its agenda. I can’t speak for Tim Cook, but I know that if I had a daughter I would literally pull a Liam Neeson and hunt these people down regardless of what law (or neck) I would have to break.
A New York judge has ruled in favor of Apple, exempting it from assisting the FBI. In effect, Apple is now indirectly withholding potential evidence from surfacing. Apple has reportedly been working on making its devices unhackable even to itself, something that has been in the works since before the San Bernardino shootings. With Apple making its own products unhackable to itself, what’s to stop criminal organizations from switching to this platform, knowing that a court has ruled in favor of protecting their privacy? If my moral compass were shifted enough, I would honestly invest in Apple products and carry on my illegal activities until someone actually decided to do something to stop me.
From the looks of things, both parties, the tech companies and the government, refuse to reach a middle ground. Thus, my question from before stands: at what point do we draw the line between protecting an individual’s rights and protecting the masses?
N.B.: Back to the idea of a court ruling: this is yet another situation where the current state of the Supreme Court may delay a decision on a pressing matter. There is no way of knowing whether a decision would go 4-4, but the Senate GOP’s refusal to consider any replacement for the recently deceased Justice Scalia raises the prospect of a deadlock, which would leave the lower federal court’s ruling standing rather than settling the question nationally.
As timeless as James Madison’s words are…
“If men were angels, no government would be necessary. If angels were to govern men, neither external nor internal controls on government would be necessary. In framing a government which is to be administered by men over men, the great difficulty lies in this: You must first enable the government to control the governed; and in the next place, oblige it to control itself.” – Federalist No. 51
Jasmine points out that the law is a little behind technology when it comes to the Fourth Amendment.
With that being said, I offer an alternative…
Under the USA Freedom Act (which grew out of the Patriot Act), the NSA is able to access phone data with permission from a federal court. Who is to say something similar couldn’t be used by the FBI to track lone wolves? Obviously an iPhone holds far more than phone records, but this language could serve as a useful template for a decision that limits the FBI’s ability to invade Americans’ individual privacy while providing an outlet to let it have its backdoor, one that can only be opened with court approval.
Notably, that law amended Section 215 of the Patriot Act to stop the NSA from continuing its mass phone-data collection program. Instead, phone companies retain the data, and the NSA can obtain information about targeted individuals with permission from a federal court.
Apple would likely not enjoy this option, but it could be a closed-enough door for them to keep a good face (by putting makeup on a black eye) and talk about their role in cooperating with national security efforts. I can see the “in an ever-growing, interconnected world, we must adjust with the times and at this time we realize that cyber security is a priority and we are not above the law of the land” statement that Tim Cook is forced to utter.
J.S.: As Jia mentioned, in the past day the court ruled in favor of Apple and denied the government’s motion to compel. I guess I really did underestimate the strength of Apple’s argument. Essentially, the court found the government’s order too burdensome under the three-part test from the 1977 case mentioned above.
This issue is something people feel strongly about. I see it on social media. I saw it in Apple’s motion and the government’s. Security and privacy just seem to be increasingly at odds as we become more technologically advanced.
Personally, I tend to be more critical of the government. I just don’t find it believable that they’ll use a backdoor for one situation or in emergency situations. They would probably overuse and misuse that power. And I like my privacy. Maybe reading 1984 at a young age has shaped my frame of reference.
For now I think the best thing we can do is find alternatives. I agree with Nick that there needs to be a legal limitation in place. One popular alternative is for Apple to increase our iCloud storage (please, Tim Cook, I need more photo storage space) and backup capability, so the government can simply subpoena the backups instead of demanding a backdoor around the encryption. Apple has stated that it would be willing to hand over the suspect’s backup data.
Something else worth mentioning is what I raised previously and Richard expanded on: we must have up-to-date laws in order to eliminate this gray area. So I’m going to hope that if this case ends up going to the Supreme Court, Congress will finally pass an adequate law that addresses the limitations and boundaries of private companies and the government, instead of leaving everyone to rely on a statute from 1789.
But, as Jia has mentioned, this all depends on the willingness of the government and tech companies to find a middle ground and compromise. Both sides so fervently believe in their own stance that this seems unlikely. And if this continues without compromise the courts will just continue to mediate and issue judgments until someone can find a way to finally protect our individual liberties and the safety of the American people.
R.S.: Well said. While I’m sure it’s not unusual, it’s bothersome that one week a federal judge gives the Bureau a go-ahead, and the next, albeit in a different case, another judge rules that such a request is too burdensome, and moreover that Congress should make the call on the appropriate compromise. Looks like the battle just moved there.
Well, thank you guys for your input. It is good for us to watch and discuss these things even if we’re not on the bench or in the Capitol.