Why Apple Thinks Hacking Into an iPhone Is a Bad Idea

Thanks to our friends at Business Insider, here's why the FBI wants Apple to hack into an iPhone — and why Apple thinks it's a bad idea.

On Tuesday a US judge ordered Apple to help the FBI unlock an encrypted iPhone.

The Cupertino, California-based company has reacted furiously.

Apple CEO Tim Cook has published a strongly worded letter, calling the demand "chilling" and arguing that it "would undermine the very freedoms and liberty our government is meant to protect."

So what's the big deal?

This was sparked by the San Bernardino mass shooting

First, some background:

FBI investigators are trying to access data on the phone of one of the two San Bernardino shooters who killed 14 people and injured 22 more in a mass shooting in California in December. They're looking to work out how the two were influenced by Islamist terrorist groups, according to The Guardian.

The phone's user, Syed Farook, was killed in a subsequent shootout. The device in question, an iPhone 5c, was encrypted using Apple's default software, meaning no one, including Apple and the FBI, can access its data without the correct passcode.

The FBI has therefore taken Apple to court to try to get its help in unlocking the phone. It isn't trying to get Apple to remove the encryption on the device altogether; rather, it is trying to get Apple to create software that bypasses the limit on the number of passcode attempts you can enter before the device auto-wipes. That would let investigators gain access to the device by trying every possible combination.
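To make the mechanics concrete, here is a toy model of why that attempt limit is the whole fight. This is an illustrative sketch only: a real iPhone enforces the limit in the operating system and hardware, not in attacker-visible code, and the class and passcode below are invented for the example.

```python
from typing import Optional

class PhoneModel:
    """Toy stand-in for a passcode-locked device with an auto-wipe limit."""

    def __init__(self, passcode: str, wipe_after: Optional[int] = 10):
        self._passcode = passcode
        self._wipe_after = wipe_after  # None models the bypass the FBI wants
        self._failures = 0
        self.wiped = False

    def try_passcode(self, guess: str) -> bool:
        if self.wiped:
            return False
        if guess == self._passcode:
            return True
        self._failures += 1
        # Stock behavior: too many wrong guesses and the data self-destructs.
        if self._wipe_after is not None and self._failures >= self._wipe_after:
            self.wiped = True
        return False

def brute_force(phone: PhoneModel) -> Optional[str]:
    # Exhaustive search over all 10,000 four-digit passcodes.
    for n in range(10_000):
        guess = f"{n:04d}"
        if phone.try_passcode(guess):
            return guess
        if phone.wiped:
            return None  # auto-wipe triggered; data is gone
    return None

# With the limit in place, brute force fails after a handful of guesses...
locked = PhoneModel("7391", wipe_after=10)
print(brute_force(locked))    # None

# ...but with the limit removed, every combination can be tried.
unlocked = PhoneModel("7391", wipe_after=None)
print(brute_force(unlocked))  # 7391
```

Even with the limit removed, iOS imposes a per-guess delay through its key-derivation design (on the order of 80 milliseconds, per Apple's security documentation), so a four-digit space takes minutes to exhaust and longer alphanumeric passcodes take far longer — which is why the FBI's request also covers entering guesses electronically rather than by hand.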

A US magistrate on Tuesday ordered Apple to assist the FBI in this.

Apple has indicated it intends to appeal — for reasons we'll get to shortly.

There's an ongoing war over privacy and lawful access to data

This court case isn't taking place in a vacuum. We're in the middle of a bitter feud between tech companies and law enforcement about the rise in the use of encryption.

In the years since NSA whistle-blower Edward Snowden's revelations about the US government's mass-surveillance programs, there has been heightened awareness of privacy issues and a push to strengthen protections on consumer products.

Apple has been one of the strongest voices in support of this move, and all new iPhones and Apple devices are now encrypted by default.

This has, predictably, infuriated some in law enforcement, who argue that vital evidence is "going dark." (Note: A recent Harvard study claims that rather than going dark, investigators have more evidence at their fingertips than ever before.)

James Comey, the director of the FBI, supports backdoors into encrypted products to allow law enforcement access when required, and there have also been legislative calls to mandate encryption backdoors.

Technologists and privacy advocates are strongly resisting this. There are numerous arguments against encryption backdoors, including that they would be subject to abuse by malicious hackers, that they would be ineffective because the criminals they intend to catch would simply switch to uncompromised encryption tools, and that it would set a dangerous precedent for authoritarian regimes to demand backdoor access from tech companies so they could crack down on activists and dissidents.

Apple is angrily rejecting 'overreach by the US government'

Let's get back to the San Bernardino case. What the FBI is asking for perhaps isn't a backdoor in the traditional sense — it's not an extra encryption key held in escrow that would let investigators immediately decrypt the iPhone data they're after.

But in an open letter published on Apple's website, CEO Tim Cook argues that it amounts to a backdoor — and that it's extremely "dangerous."

Cook says what the FBI is asking for does not exist, and Apple would have to make it. "The FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone's physical possession."

He argues that complying will make ordinary people "less safe." In his words (emphasis ours):

The government suggests this tool could only be used once, on one phone. But that's simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

The Apple CEO then describes the demand as a "dangerous precedent," which would grant the US government "the power to reach into anyone's device to capture their data."

"The government," Cook continued, "could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge."

He concludes: "While we believe the FBI's intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect."

Activists are rallying in support of Apple

The Electronic Frontier Foundation, a civil liberties group, is supporting Apple. It worries that the FBI's demands would set a precedent, and that if Apple is forced to create the code, it will be used again and again.

"For the first time, the government is requesting Apple write brand new code that eliminates key features of iPhone security — security features that protect us all," EFF deputy executive director Kurt Opsahl wrote in a blog post.

"Essentially, the government is asking Apple to create a master key so that it can open a single phone. And once that master key is created, we're certain that our government will ask for it again and again, for other phones, and turn this power against any software or device that has the audacity to offer strong security."

Opsahl added: "The US government wants us to trust that it won't misuse this power. But we can all imagine the myriad ways this new authority could be abused. Even if you trust the US government, once this master key is created, governments around the world will surely demand that Apple undermine the security of their citizens as well."


FBI isn't looking for access to this phone. They want Apple to develop a forensics back door for them. pic.twitter.com/Hrx1NZ5o3X

— Jonathan Ździarski (@JZdziarski) February 17, 2016


What happens next?

The court order directing Apple to assist the FBI concludes: "To the extent that Apple believes that compliance with this order would be unreasonably burdensome, it may make an application to this court for relief within five business days of receipt of the order."

Cook has made it clear his company is opposing the order, because "we feel we must speak up in the face of what we see as an overreach by the US government."

Now comes a (most likely lengthy) legal showdown between the FBI and Apple — one that privacy activists and law enforcement will be watching extremely carefully.

Here's the full letter from Apple:

February 16, 2016

A Message to Our Customers

The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.

This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.

The Need for Encryption

Smartphones, led by iPhone, have become an essential part of our lives. People use them to store an incredible amount of personal information, from our private conversations to our photos, our music, our notes, our calendars and contacts, our financial information and health data, even where we have been and where we are going.

All that information needs to be protected from hackers and criminals who want to access it, steal it, and use it without our knowledge or permission. Customers expect Apple and other technology companies to do everything in our power to protect their personal information, and at Apple we are deeply committed to safeguarding their data.

Compromising the security of our personal information can ultimately put our personal safety at risk. That is why encryption has become so important to all of us.

For many years, we have used encryption to protect our customers' personal data because we believe it's the only way to keep their information safe. We have even put that data out of our own reach, because we believe the contents of your iPhone are none of our business.

The San Bernardino Case

We were shocked and outraged by the deadly act of terrorism in San Bernardino last December. We mourn the loss of life and want justice for all those whose lives were affected. The FBI asked us for help in the days following the attack, and we have worked hard to support the government's efforts to solve this horrible crime. We have no sympathy for terrorists.

When the FBI has requested data that's in our possession, we have provided it. Apple complies with valid subpoenas and search warrants, as we have in the San Bernardino case. We have also made Apple engineers available to advise the FBI, and we've offered our best ideas on a number of investigative options at their disposal.

We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.

Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone's physical possession.

The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.

The Threat to Data Security

Some would argue that building a backdoor for just one iPhone is a simple, clean-cut solution. But it ignores both the basics of digital security and the significance of what the government is demanding in this case.

In today's digital world, the "key" to an encrypted system is a piece of information that unlocks the data, and it is only as secure as the protections around it. Once the information is known, or a way to bypass the code is revealed, the encryption can be defeated by anyone with that knowledge.

The government suggests this tool could only be used once, on one phone. But that's simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

A Dangerous Precedent

Rather than asking for legislative action through Congress, the FBI is proposing an unprecedented use of the All Writs Act of 1789 to justify an expansion of its authority.

The government would have us remove security features and add new capabilities to the operating system, allowing a passcode to be input electronically. This would make it easier to unlock an iPhone by "brute force," trying thousands or millions of combinations with the speed of a modern computer.

The implications of the government's demands are chilling. If the government can use the All Writs Act to make it easier to unlock your iPhone, it would have the power to reach into anyone's device to capture their data. The government could extend this breach of privacy and demand that Apple build surveillance software to intercept your messages, access your health records or financial data, track your location, or even access your phone's microphone or camera without your knowledge.

Opposing this order is not something we take lightly. We feel we must speak up in the face of what we see as an overreach by the U.S. government.

We are challenging the FBI's demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.

While we believe the FBI's intentions are good, it would be wrong for the government to force us to build a backdoor into our products. And ultimately, we fear that this demand would undermine the very freedoms and liberty our government is meant to protect.

Tim Cook