Should Apple comply with an order to unlock a San Bernardino shooter’s iPhone?

“In a first-of-its-kind ruling, a U.S. magistrate has ordered Apple to assist the government in unlocking the iPhone of San Bernardino shooter Syed Rizwan Farook,” writes NPR’s Marie Andrusewicz. “The FBI is seeking information that may be on Farook’s employer-issued phone as it investigates the Dec. 2 shootings that left 14 people dead.”

At the time of the attack, Farook and his wife, Tashfeen Malik, destroyed two personally owned cellphones and removed a hard drive from their computer.

In what Apple described as a “customer letter” posted on its website late Tuesday, CEO Tim Cook said Apple will contest the judge’s order.

Apple said that giving the FBI technology that would allow it to crack iPhone encryption carried “implications far beyond the legal case at hand.”

Today’s Question: Should Apple comply with an order to unlock a San Bernardino shooter’s iPhone?

  • hoffingerte

    Yes, immoral acts warrant scrutiny. Privacy rights do not safeguard depravity.

  • Depends on the actual question.
    Should Apple comply with a subpoena to unlock the phone? Yes.
    Should Apple comply with an order to turn over the methods to unlock the phone? No.
    Besides, the phone does not belong to the defendant. It belongs to his employer. Why is the employer balking at unlocking the phone?

    • Mark in Ohio

      I would assume that the employer doesn’t know the password, and thus can’t unlock the phone.

  • Tony Rogers

    Apple is way too smart for this nonsense.
    Why were the “terrorists” driving slowly with their hazard lights on??

    • Ruckabumpkus

      I don’t get it. What does that conspiracy theory have to do with this question?

      • Tony Rogers

        I don’t know… nobody is capable of answering my simple question.

        Why did the “terrorists” drive slowly with their hazard lights on??

        • Ruckabumpkus

          May I suggest you ask that question in a forum where it’s on-topic?

          • Tony Rogers

            This article is about San Bernardino right??

            You can’t answer my simple San Bernardino question??

          • Ruckabumpkus

            Sorry, it seemed to me you were asking a rhetorical question in support of a conspiracy theory, and I have no intention of indulging you by attempting to answer it. And no, this “Today’s Question” is not directly about the San Bernardino shooting. It’s about whether Apple should be required to hack an encrypted iPhone so the FBI can access the info on it.

          • Tony Rogers

            “I have no intention of indulging you by attempting to answer”

            lol… You couldn’t justify their behavior even if you wanted to.

  • Mark in Ohio

    If the issue were that Apple had the password stored somewhere and was simply required to turn it over, then yes. I believe that in this case, no one other than the now-deceased defendant knew the password. It sounds like the judge is ordering Apple to rewrite its operating system to allow remote unlocking of this and other Apple devices. To that, I would say no, they shouldn’t have to comply. That would be akin to requiring a bank-vault manufacturer to include an override keyslot that fits a common house key. Could the judge compel the defendant to give up the password? Wouldn’t that violate his right against self-incrimination? On a more technical note, I also wonder how they could force a system update onto a locked device, as would be necessary to install a backdoor.

    In the general case, if the U.S. government says it must have a backdoor, then any other country can say the same thing. I can’t see a reason our government would have rights to order Apple around that other governments couldn’t also use for their own interests. If that’s the case, we might as well not have any security measures at all. Apple is right in saying that you can’t make a secure backdoor that hackers can’t exploit.

  • Rich in Duluth

    Yes, but… I think Apple should open the phone to access whatever “information” the government needs; however, turning over the technology to access any iPhone should not be done. The data access could be done at Apple’s labs, supervised by government officials.

  • townail22

    Since this was an employer-issued phone, the employer should demand that Apple unlock it to make sure its assets weren’t used in the attack.

  • KTN

    No. Apple is under no obligation to open that phone or any other phone for the government – regardless of the request. If Apple is telling the truth, they don’t have the technology to provide that key, even if they wanted to.
    The government is not really in the business of nuance, and once they have the keys, well, where will they want to reach next? Your phone.

  • Sue de Nim

    I’m not worried so much that our government would misuse such information, but that it would get loose and be exploited by others. If the government were allowed to force tech companies to install back doors in their encrypted systems, there would be no way to keep industrious hackers from discovering them, and then foreign governments and huge amoral corporations would be able to exploit our private data.

    • Tony Rogers

      “I’m not worried so much” about the Lies and Government Malpractice.

      What about Bush’s WMD and the Snowden Documents??

      • Sue de Nim

        Bush’s lies about Iraq having WMDs are related to the issue of personal data privacy how? As for what Snowden leaked, it was harmless as long as it was still secret, because the government couldn’t use it against Americans without revealing its existence. I regard Snowden as a traitor. In America I’m way more likely to be taken advantage of by business interests than oppressed by my government; hence the direction of my worries. I don’t traffic in conspiracy theories.

        • Tony Rogers

          Thanks to Snowden, we know how the federal government abuses private data.

          If you need proof, the documents are posted online:

  • Jon Bohlinger

    No. They wrote legal code and shouldn’t be forced to amend it. If the justice department wants future code to contain a backdoor, they can go to Congress.

  • MarkUp

    Yes. There’s already legal precedent supporting the court’s request:

    As a feature, the lock was meant to boost consumer confidence in the product, not be an obstruction to justice.

    If a court can issue a search warrant on your house and break down your door to search through your belongings, why not the contents of an electronic device?

    I see a lot of arguments that we need to change the law to give the courts permission to do so, but the courts currently reference cases involving telecommunications. If we want the courts to reference different laws for electronic devices, we need to pass laws distinguishing smart devices from telephones.

    • Ann Nasses

      Because the technical reality would be more like this: the cops want a new key made that lets them into your house, and it comes with the mold to make a key for any house they may decide to search in the future, while there’s no way to prevent criminals (hackers) from also finding that mold and making keys to let themselves into any house they want.

      • MarkUp

        That’s a good analogy. It reminds me of the “lock bumping” stories that came out over the last decade.

        I still feel the court orders and decisions the FBI references support their legal right to request that companies change their practices in this way. The FBI has made no request of Apple that they haven’t made of other companies in the past, and they’ve won the court challenges when it’s gone that far.

        To be more specific to this issue, here’s a great blog post from a security firm breaking down the specific security features the FBI are encountering and what specifically they’re asking for:

        The authors suggest this backdoor can be limited to “the specific recovered iPhone” they’re hacking. It’s an iPhone 5C, a device that lacks the security features introduced in the generation that immediately followed it (over 80% of iPhones on the market already have stronger security measures in place). Even if the backdoor were leaked, the key would be obsolete in less than 3 years.
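
        For scale, what those retry limits actually protect can be sketched with a quick back-of-the-envelope calculation. The ~80 ms per attempt used below is an assumption drawn from Apple’s published key-derivation timing for hardware of that era, not a figure from the court filings:

```python
# Rough worst-case time to brute-force a numeric passcode once the
# 10-attempt erase limit and escalating delays are removed.
# ATTEMPT_SECONDS is an assumed figure (~80 ms of hardware-bound key
# derivation per guess), not a measured one.
ATTEMPT_SECONDS = 0.08

def worst_case_hours(digits: int) -> float:
    """Hours needed to try every numeric passcode of the given length."""
    return (10 ** digits) * ATTEMPT_SECONDS / 3600

for digits in (4, 6):
    print(f"{digits}-digit passcode: {worst_case_hours(digits):.1f} hours worst case")
```

        Under that assumption, a 4-digit passcode falls in minutes and even a 6-digit one in about a day, which is why the auto-erase limit, not the encryption math itself, is the real barrier on a 5C.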

        I guess I’m OK with Apple giving the FBI a master key to a lock on a phone they don’t sell anymore.

        • Ann Nasses

          The FBI *is* asking for something new, though. They have never before asked that a company create a brand-new technology that makes the company’s own product less secure. This is not just a matter of saying, “OK, that phone’s password is 1234.” In fact, even Apple cannot currently access that phone’s contents.

          As far as there being a limit on any damage done, I feel differently than you do. Once the technology exists, it will be hacked; that has unfortunately been proven with every new technology we create. It will also almost certainly be adaptable to any new iOS versions that are developed, making Apple a business with above-average insecurity.

          Even if we could somehow limit the damage to only the 5C, I am not okay with saying that the FBI, criminals, and foreign governments can have access to my data for any length of time. And I am especially not okay with saying that those who cannot afford to upgrade to the newest model must live with that risk longer than those who have the means to replace their devices immediately.

  • Pearly

    The original owner of the phone (the employer) should be able to get the San Bernardino terrorist’s password from Apple.

    • Lindsey

      According to Apple, if you forget your password, there is no way to get into the phone without erasing it and hoping it was backed up.

  • Geezer44

    Yes, in order to save future lives Apple needs to comply and work with the FBI. What person wouldn’t give up their privacy if they knew it would prevent people from being killed?

    • Dave

      You propose a false choice.

    • Ann Nasses

      Those who would give up liberty in pursuit of security deserve neither. — Ben Franklin

    • kajar9

      Saying you don’t care about privacy because you have nothing to hide is the same as saying you don’t care about freedom of speech because you have nothing to say.

    • Ghoster

      I wouldn’t. To quote Ben Franklin, “those who would sacrifice liberty for security deserve neither.”

  • Dave

    Absolutely not. To say nothing of privacy concerns and the extraordinary precedent this sets, the crime was already committed. Everyone was already dead before law enforcement knew anything about this crime.

  • Ruckabumpkus

    So, help me understand this. Apple designed this encryption system with the idea that even their own engineers wouldn’t be able to crack it without the password. If they tried to comply with this court order and succeeded, wouldn’t that mean their system was flawed? Could they then be sued for false advertising? And anyway, doesn’t the NSA have the best code breakers in the world? If it’s remotely possible to crack that phone’s encryption, why does the FBI need Apple to do it?

  • John Dilligaf

    As much as I’d love for the FBI to have the information inside the phone, I don’t think they should be able to compel Apple to engineer a back door into their system. Here’s an article that sums up what I was thinking nicely: