This post is part of "Apple v. FBI," a series of six posts.

Apple is in the news for the unprecedented order issued by a Federal Magistrate Judge in California this week. The order compels Apple to assist the government in hacking into a terrorist’s phone. Apple publicly refuses to comply, and every one of us ought to be standing by Apple and supporting its decision. What Apple is being asked to do will ultimately undermine every individual’s civil liberties and quite likely put national security at risk.

The order goes beyond requiring Apple to “assist law enforcement.” Read it in full and its far-reaching implications become clear. Apple is being ordered to create new technology, something that does not at this time exist, to undermine its privacy-protecting encryption. In other words, it is being ordered to create a master key to hack any iPhone on the planet.

Such an order is well beyond the scope of reason, and what the court is demanding that Apple do will ultimately undermine any hope of any of us ever having privacy in the digital age.

This is not an exaggeration.

Remember that after September 11, it seemed like a good idea to go after terrorists by any means necessary, including listening to their phone calls, intruding on their privacy in any way necessary to stop them. Why should terrorists have any right to privacy? But when that idea was applied in reality, none of us retained our privacy. Edward Snowden’s revelations opened our eyes to the fact that we had given up all of our own protections in the name of fighting terrorism.

What Apple is being asked to do is not just to provide law enforcement with information helpful to understanding December’s terror attack and preventing others. Standing alone, that sounds like an honorable request. In reality, Apple is being asked to take its solid encryption product, a product on which millions of iPhone users rely (whether they realize it or not) to protect their privacy, and undo it. Apple is being asked to create technology to undermine the very encryption that currently protects us from government intrusion.

It is easy to oversimplify this situation and rationalize that this order only pertains to one specific phone, and no one wants to stand up and argue that this terrorist’s privacy rights should not be violated. In fact, this particular terrorist was not even using his own phone, but rather one owned by his employer, so he really could not have expected any privacy on the phone. So, why not go ahead and have Apple break the encryption on this one phone?

That might be a reasonable argument if what Apple were being required to do was break the four-digit lock code on this particular person’s phone—something Apple already has the means to do, which would be akin to a landlord providing a key to a rented apartment.

That’s just not the case here. The court did not order Apple to turn over the front door key that it already has; it ordered Apple to create a key to a door that currently has no keyhole. It ordered Apple to make something new to undermine the lock we all have in place. As Apple said in its public statement that it would defy this order, “The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices.”

Apple is absolutely correct. Once it creates the tool to break the encryption of any iPhone, that tool will be used again and again. At some point, it will become commonplace and accepted. None of us will expect that our phones are secure from government intrusion. Start down this slippery slope, and never again will we believe in any form of digital privacy.

Finally, if this order stands, what is next? As Apple notes in its statement, who is to say that the government would not next order it to “build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone’s microphone or camera without your knowledge”? There is no assurance whatsoever that cracking the door open will not make it swing right off its hinges.

Gaining access to this one terrorist’s secrets is simply not worth the massive downside. There is always information law enforcement cannot get to, and while the digital age has made it harder to hide our secrets, requiring Apple to unlock the contents of this phone is another step toward unraveling the fabric of the American system. Every lawyer should be standing by Apple and supporting its decision to defy this order.




  1. sunshipballoons says:

    My reaction is that this would be problematic even without the privacy issue (which, too, is problematic). Can the FBI get a court order requiring Smith & Wesson to produce a gun that would be helpful in arresting criminals? Requiring Chevy to build a car that would help them in car chases? That would require Apple to build them a special FBiPhone? Since when can the government conscript corporations in this manner?

    • Megan Zavieh says:

      You are precisely right. When would it end? A court thinks up an idea that normally private enterprise would develop, patent, and profit from, but instead the court orders private industry to create a product for law enforcement (or some other entity of the court’s choosing) to use?

      • Paul Marks says:

        Kudos to Megan. I predict this court order will have the half-life of a cheap tattoo. Most curious is paragraph 5 of the order, requiring Apple to apprise the FBI of the reasonable cost of the software development — without any indication of who is to pay. “Hey, FBI, here’s an invoice for $12,000,000 in development costs we diverted from paying customers. Let me know when the check is in the mail, will ya? Love, Apple.”

      • John Eversole says:

We can only hope that law enforcement can use all tools available to deter and arrest terrorist scum. I will gladly give up my cell phone “privacy”. And don’t give me the first-year law student slippery slope fallacy.

        • Sam Glover says:

          There’s no slippery slope here. This is binary. There is no iPhone skeleton key now. The FBI wants Apple to create one. Once it exists, you can’t make it disappear.

          There are certainly people who think as you do, but I hope liberty (and Apple) prevails in this case.

  2. theprez98 says:

    When lawyers can’t get the facts straight, we’re doomed.

  3. John Eversole says:

    Apple is dead wrong…privacy blah blah blah- this lawyer thinks that saving lives is more important than any slippery slope nonsense privacy claim.

    • Paul Marks says:

      On the continuum between privacy and security, I may well be at or near Mr. Eversole’s position. I am not sure I would go so far as to say, “privacy, blah blah blah,” but the PATRIOT Act has never particularly bothered me. Instead, I view this as a liberty issue more than a privacy v. security issue. The slippery slope in question is not eroding privacy interests, but how far the government can force private parties to expend their resources in aid of government. (By the way, my earlier $12,000,000 estimate was probably orders of magnitude too low — this is software that has never been developed before. I’m no coder, but I don’t think you crank this stuff out overnight. Here are some things never developed before that were later developed: Tesla; iPad; Space Shuttle. How many hundreds of millions or billions in development costs there?) There are many nuances here, but if it is critically important to national security to unlock this iPhone, then isn’t it just as critically important to national security for the FBI to win the legal battle against Apple, so as to require the software to be developed? Makes sense to me. So, can the FBI involuntarily conscript the foremost legal expert in this area to help with its legal briefs, for no pay? What’s the difference?

      • Paul Marks says:

        Further thought: If Apple spends $50,000,000 in development costs but really and truly cannot develop software that can unlock this iPhone, what is the court to do? Force Apple to spend $50,000,000 more?

    • Megan Zavieh says:

      This is so much more than a privacy issue. The privacy issue is certainly concerning, and it is really disturbing to read the DOJ’s motion to compel compliance with the order (filed February 19) in which the DOJ sounds appalled that the government is not in control of what information is available to it. That very attitude, that the government is to be in control of all information, is highly disturbing.

      What this order does, though, is implicate the rights of citizens and entities to go about their lives without government interference. That freedom from government interference is critical to the fabric of American life. I am not talking about the barely-noticed government intrusion of NSA hacks, even; I am talking about the type of government intrusion where an FBI agent knocks on your door at 8 am and says you aren’t going to your regular work today, you aren’t picking up your kids from school today, because you’re going with him to work on something the government has decided you need to do instead of your real life.

      That is the slippery slope. When the government has an idea of a tool it would like to have but that tool does not exist, it is not in keeping with American ideals or jurisprudence to allow the government to find someone capable of making its desired tool and get a court to order that person to create it. That is involuntary servitude, and we have laws against that.

      Plus, as Mr. Marks points out in his spot-on comments, who is to pay for this diverted innovation? Are we to expect the government to take over our income-producing time and divert it to government use without compensation? The order from earlier this week does not contain a provision that Apple will be paid for the (multi-million dollar?) cost of creating this new technology. And what of Apple’s shareholders who will be harmed by this unforeseen cost to Apple?

      Finally, there is no indication from the DOJ that any lives are going to be saved by Apple’s compliance with this order. In its motion to compel, the DOJ says that it believes the phone contains communications between the terrorist and his victims on the day of the attack. While this may help the FBI piece together what happened that day, there is no indication in the DOJ’s motion that the information it expects to uncover will assist in preventing another terror attack. If the DOJ had a basis to argue that the information would somehow protect the public, there is no doubt such a compelling reason would have been included in its filing. The DOJ’s silence is deafening.

  4. FreetoHide says:

    Apple needs to comply. This is a court order with the proper procedure and with the proper protections for the privacy of all, except those involved in the investigation.
    The government can force you to do any number of things, remember? Pay child support, produce documents, refrain from approaching people or places—and it can also take away your freedom for refusing to comply with its orders.
    Zuckerberg said it: privacy is a myth. And after all, if you have nothing to hide, why are you worried about your privacy? You gave all your information to Facebook, why not the government?

  5. April King says:

    Thanks, Megan. I’m another step closer to becoming an Apple person after this.

  6. Gasbird says:

    F.Y.I., there is a petition to sign on the White House petitions site.
