What you need to know about Apple’s privacy battle with FBI

This case highlights a major issue concerning iOS security.
Photo: Jim Merithew/Cult of Mac

The case involving San Bernardino shooter Syed Rizwan Farook’s iPhone 5c, and whether Apple should help unlock it, has brought the company’s stance on strong encryption to the forefront.

Since this privacy-versus-security debate isn’t going away anytime soon, here’s what you need to know about it so far — and why it’s a much, much bigger issue than just one legal case.

Why all the fuss?

In December 2015, shooters Syed Rizwan Farook and Tashfeen Malik murdered 14 people and injured 22 after opening fire at an office party in San Bernardino, California, in an apparent terrorist attack. After the shooting, the FBI recovered an iPhone 5c belonging to Farook, but investigators have been unable to unlock it because of Apple’s encryption.

On iOS devices, important files are encrypted in such a way that the phone can only be unlocked with a manually entered passcode, and user data can be set to wipe after 10 incorrect attempts. Yesterday, United States magistrate judge Sheri Pym ordered Apple to give the FBI a custom firmware file that would allow it to unlock the iPhone 5c in question.
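The auto-wipe behavior is an opt-in setting (“Erase Data”), which erases the device after 10 consecutive failed passcode attempts. As a rough illustration of the counter logic, not Apple’s actual implementation, here is a minimal sketch in Swift:

```swift
import Foundation

// Minimal sketch of the auto-erase protection described above; this is
// illustrative only, not Apple's code. With the "Erase Data" setting
// enabled, iOS wipes user data after 10 consecutive failed attempts.
struct ErasePolicy {
    private(set) var failedAttempts = 0
    let wipeThreshold = 10

    // Returns true when the device should wipe its user data.
    mutating func recordFailedAttempt() -> Bool {
        failedAttempts += 1
        return failedAttempts >= wipeThreshold
    }

    mutating func recordSuccessfulUnlock() {
        failedAttempts = 0  // counter resets on a correct passcode
    }
}
```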

The handset in question is a 2013-era iPhone 5c.
Photo: Apple

Exactly what is being asked for?

The FBI wants Apple to build a special version of iOS that works only on the iPhone that has been recovered. This version would differ from regular iOS in three major ways.

Firstly, Apple would bypass or disable the auto-erase function for the device in question.

Secondly, Apple would enable the FBI to submit passcodes electronically, via the device’s physical port, Bluetooth, Wi-Fi or another protocol, rather than entering each PIN attempt by hand.

Finally, Apple would stop iOS from deliberately introducing delays between passcode attempts. In standard iOS installations, these delays get longer as more wrong codes are entered, eventually reaching one hour between attempts.
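Together, the three changes reduce unlocking the phone to a raw guessing exercise. As a back-of-the-envelope sketch, assume the delay schedule published in Apple’s iOS Security Guide and the roughly 80 milliseconds of hardware-bound time per guess that security researchers have cited for this model; both figures are assumptions here:

```swift
import Foundation

// Escalating delays per Apple's published iOS Security Guide: none for
// the first four failures, then 1, 5, 15 and finally 60 minutes.
func delayAfterFailure(_ failedAttempts: Int) -> TimeInterval {
    switch failedAttempts {
    case 0...3: return 0
    case 4:     return 60        // 1 minute
    case 5:     return 5 * 60    // 5 minutes
    case 6, 7:  return 15 * 60   // 15 minutes
    default:    return 60 * 60   // 1 hour
    }
}

// With the delays and auto-erase removed, brute force is bounded only
// by the ~80ms of hardware time per guess (an assumed figure).
let secondsPerGuess = 0.08
for (label, combinations) in [("4-digit PIN", 1e4), ("6-digit PIN", 1e6)] {
    let minutes = combinations * secondsPerGuess / 60
    print("\(label): roughly \(Int(minutes)) minutes worst case")
}
// Prints: 4-digit PIN: roughly 13 minutes worst case
//         6-digit PIN: roughly 1333 minutes worst case (about 22 hours)
```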

The FBI is unable to create its own iOS firmware and sideload it through DFU mode because its agents don’t have access to the keys Apple uses to sign firmware images. The federal court order therefore demands that Apple provide the FBI with a signed iPhone software file that runs only from the RAM of the terrorist’s iPhone, and then give the bureau remote access to the device.
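Code signing is the gatekeeper here: the device ships with Apple’s public key baked into its boot chain, and DFU mode refuses any firmware image whose signature does not verify against it. The sketch below is purely conceptual, using CryptoKit’s Curve25519 signatures to stand in for Apple’s own signing infrastructure:

```swift
import CryptoKit
import Foundation

// Conceptual illustration only; Apple's boot chain uses its own
// signing infrastructure, not this API. The point is the principle:
// without Apple's private key, nobody can produce a signature that
// makes a modified firmware image pass this check.
func deviceWillAcceptFirmware(image: Data,
                              signature: Data,
                              applePublicKey: Curve25519.Signing.PublicKey) -> Bool {
    applePublicKey.isValidSignature(signature, for: image)
}
```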

It is argued that this work could be carried out on Apple’s campus, without the feds ever getting their hands on the technology involved. The problem is that it would still result in a master key that could theoretically be used by the FBI and others to hack any iOS device.

Can Apple do this?

A blog entry from Trail of Bits suggests that Apple has the power to do this, despite its strong iOS encryption. Security expert Dan Guido writes:

“Apple has allegedly cooperated with law enforcement in the past by using a custom firmware image that bypassed the passcode lock screen. This simple UI hack was sufficient in earlier versions of iOS since most files were unencrypted. However, since iOS 8, it has become the default for nearly all applications to encrypt their data with a combination of the phone passcode and the hardware key. This change necessitates guessing the passcode and has led directly to this request for technical assistance from the FBI.

I believe it is technically feasible for Apple to comply with all of the FBI’s requests in this case. On the iPhone 5C, the passcode delay and device erasure are implemented in software and Apple can add support for peripheral devices that facilitate PIN code entry. In order to limit the risk of abuse, Apple can lock the customized version of iOS to only work on the specific recovered iPhone and perform all recovery on their own, without sharing the firmware image with the FBI.”

If you’re interested in the specifics, Guido goes into far more detail on his blog about how Apple could overwrite the iPhone’s firmware with a version that conforms to all the requested specifications, allowing the FBI to brute-force its way into the handset.
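The hardware key Guido mentions is also why the guessing has to happen on the phone itself: each file key is derived from both the passcode and a unique device key (UID) fused into the silicon, so the search cannot be offloaded to a datacenter full of GPUs. A rough sketch of that entanglement, substituting CryptoKit’s HKDF for Apple’s actual iterated derivation inside the device’s AES engine:

```swift
import CryptoKit
import Foundation

// Rough illustration of passcode/hardware-key entanglement; HKDF
// stands in for Apple's real derivation, which iterates through the
// device's AES engine. The takeaway: the passcode alone, without the
// device's unique hardware key, cannot reproduce the file key.
func deriveFileKey(passcode: String, hardwareUID: SymmetricKey) -> SymmetricKey {
    HKDF<SHA256>.deriveKey(
        inputKeyMaterial: hardwareUID,
        salt: Data(passcode.utf8),   // passcode mixed in (illustrative)
        info: Data("file-key".utf8),
        outputByteCount: 32          // 256-bit key
    )
}
```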

So what is the problem?

Tim Cook has repeatedly spoken out in favor of privacy.
Photo: Jim Merithew/Cult of Mac

Right from the start, Apple has cast user privacy as a moral issue every bit as much as a technical one. In other words, just because Apple could conceivably hack an iPhone doesn’t mean that it should. In an open letter published today, Tim Cook explained his position:

“When the FBI has requested data that’s in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we’ve offered our best ideas on a number of investigative options at their disposal.

We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.”

At the end of the letter, Cook suggests that — good intentions aside — the FBI may end up undermining “the very freedoms and liberty our government is meant to protect.”

While there’s no doubt that Apple is no supporter of terrorism, the company is definitely a proponent of strong encryption when it comes to keeping users safe. The clash between those opposing ideological stances is why this issue is so much bigger than the single case in question.

What has the response been so far?

Mixed, although the majority of coverage has been in Apple’s favor. Big names including Steve Wozniak and Edward Snowden have rushed to support Apple, and the issue has even united Apple rivals like Google and Microsoft, which have spoken out in support of the company’s pro-encryption stance.

On the other side of the coin? The biggest name so far has been presidential hopeful Donald Trump. But, hey, as Aaron Levie, CEO of Box, tweeted, “Simple security rule of thumb: don’t build encryption for how the world is today, but how it could be if Donald Trump were President.”

Why is this such a big story?

Because while privacy has been a big issue for the past several years, we’ve never reached an inflection point quite like this one. Apple’s firm stance in favor of strong encryption, backed by much of Silicon Valley, represents a real battle for the future of technology as we know it.

Given Tim Cook’s pro-privacy advocacy, this battle could turn out to define the Apple CEO’s legacy.

What happens next?

There are likely to be plenty more twists and turns before this case is settled, but while Apple has so far resisted the order to unlock the iPhone 5c for the FBI, it has been granted a bit more time to comply by the U.S. magistrate who first handed down the order.

Apple’s response is now due in court on February 26, instead of next Tuesday. The company is also reportedly enlisting free-speech attorneys to help do battle with the government.


  • stickyicky97

    I am a big proponent of user privacy, but if the court issues a warrant to allow the FBI access to this terrorist’s phone or any potential criminal’s phone, then there should be a way to allow access. Keep in mind, I only agree with this in dire circumstances and with appropriate, highly scrutinized court orders.

    • WolfB

      Unfortunately, court orders are not always “highly scrutinized” and are often secret, especially at the federal level. Apple is trying to prevent opening a Pandora’s box here. What happened in San Bernardino is deplorable, but if the FBI’s case somehow hinges on the data on one phone, they have a lot more problems than this.

      • aardman

        The history of law enforcement and the intelligence services has clearly demonstrated that any such opening will be abused and the abuse will be concealed.

    • There definitely should not be a way to allow access. Our privacy is more important. Even if there were useful information on the recovered iPhone 5c, it’s not worth sacrificing the privacy of every single iOS device ever created just to get it. That’s unacceptable. Don’t you understand how devastating it would be to allow the FBI special access? That would allow everyone in the world access to each other’s iOS devices. And that doesn’t only refer to iPhones, by the way.

    • neroden

      The court order is a “slave labor” order, ordering Apple programmers to work for free creating a piece of software which does not exist.

      This is well beyond the legal powers of the court or the FBI. It is a 13th Amendment violation.

    • Gillian Zyland

      “…this terrorist’s phone or any *potential* criminal’s phone…”??? Seriously think about the implications of what you’ve written there. Maybe have a cup of coffee before you start giving in to the temptation to jump into the comments section and start advocating for the obliteration of our civil liberties. Just a thought.

    • site7000

      What court? The court in China that wants to crack a dissident’s or an American businessman’s phone? China is literally drooling for the FBI to win this one.

  • GaelicSoxFan

    A back door for the good guys is also a back door for the bad guys.

  • yankeebobo

    The problem, aside from obvious user privacy, is precedent. If this case comes to forcing Apple’s hand at custom-building a firmware for recovery, it will forever be known and referenced as People v. Farook/Apple. And that makes for an easier argument next time.

    Apple may have the capability to build such a firmware, even one locked to the hardware address of the one iPhone in question, but this “good guy” approach, if it falls into “bad guy” territory, will expose far more implications down the road than it will solve here. They are talking about 18 minutes of dead air here. They cannot account for 18 minutes of time spent in the lives of the terrorists. And who’s to say that those 18 minutes would even be revealed through an unlocked/decrypted iPhone? The only thing they may garner off it is GPS signaling, and if that leads nowhere, then precedent is the only gain for the government in the future.

  • 919263

    Well, my 2 cents… All this privacy BS is good, but just imagine you are the husband or wife of one of the people shot down in this case. Would you really care about all this privacy BS, or would you really want to know how to get the info so you can learn how/why/who did this and supported it…

    • neroden

      Yes, you absolutely would care about the privacy of all Americans. If you think you wouldn’t, well, please post all your credit card numbers and bank information online. Don’t want to? Right, you care about privacy.

      • 919263

        I am at a loss for words… You need to read up, dude, if you are that… Privacy and security are 2 different things… but I guess you would not understand

      • Peter Sichel

        It sounds like you need some help connecting the dots. All encryption depends on keeping secrets. If those secrets are ever revealed, it’s broken for everyone. Including your bank account number and password. That’s the whole point. Weakening encryption is a really bad idea.

  • Gillian Zyland

    Yankeebobo has brought up the single most important reason illustrating why Apple can’t possibly consider complying with this FBI request. You know all those police procedurals and other shows with lawyers? There’s often a crucial, game-changing scene where someone goes to court and a lawyer argues, “But there IS precedent, your honor! In the case of [Smith v. Jones, 1962], the court found that [blahblah blah and BLAH], which absolutely nullifies the argument that my opponent is making here — you’ve no choice now but to [X, Y & Z], sir — immediately!” And the judge realizes she’s over a barrel and grudgingly XYZ’s. THIS is what the FBI is looking for.

    With all these cases now that hinge on whether the government has the right to override contracts between users and communications device makers (most commonly cell phone manufacturers), this issue has become more and more controversial, and privacy has become a casualty of “top-secret” government investigations. The more the FBI manages to pressure high-profile companies into complying with orders to break privacy contracts in media-magnet cases, the more privacy laws are eroded. It’s a slippery slope: once there’s precedent for X, that can be used in court to justify Y, and it won’t be long before it’s all rolling downhill and Z, Z, Z is being enacted for negligible gain. And all it took to start that snowball on its journey of decline was for one big company, like Apple, to say “OK, we’ll do it.”

    And no matter how secret it is, and how many promises of one-time-only are thrown around, there’s no undoing the fact that the technology then exists. The software is there when it wasn’t there before. And then how do you not say, “weeeellll, as long as it’s *there*, right?” And what a waste, to have spent so much time and energy creating a little app that could only be used *once*… If it’s there, it can be stolen, it can be hacked; the engineers who created it can be bribed and blackmailed and kidnapped and coerced — on and on. Whereas this can’t happen now, because the technology simply doesn’t exist, so there IS no weak link. That cuts off countless possibilities for abuse right there, before they even get started. And anybody seriously concerned about our civil liberties would have to be pretty intentionally obtuse not to understand that giving in to FBI pressure would be a huge crime against ALL Americans. Tim Cook is absolutely right.

  • Michael Johnson

    I’ve always thought that Apple already had a backdoor into their iOS devices, not to stop terrorists, but for their own personal gain. Think about it: how awesome would it be to be able to spy on any competitor with an iPhone?

    Anyway, couldn’t the FBI just send the iPhone to Apple and then Apple could hack it and send them the data?