What’s at Stake for Apple in iPhone Legal Case

Enter passcode screen of an iPhone running iOS 9

By Colin Poitras, UConn Communications

This story originally appeared in UConn Today.

In what some are calling the most important technology case of the decade, the FBI has obtained a court order compelling tech giant Apple to develop special software that would allow investigators to bypass security measures and unlock an iPhone used by one of the shooters in the San Bernardino mass shooting last December.

But Apple CEO Tim Cook is refusing to comply. Cook says the government’s request would force Apple to “hack our own users and undermine decades of security advancements that protect our customers.” The case has become the focus of a national debate pitting the government’s interest in protecting national security against the fundamental rights of companies and civilians to conduct their business without government intrusion.

Apple has until Feb. 26 to file its formal objections in court.

With the case continuing to capture daily headlines, UConn Today discusses the technical issues underlying it with Laurent Michel, associate professor of computer science and engineering. Michel is co-director of the Comcast Center of Excellence for Security Innovation at UConn, an advanced cybersecurity lab.

Q: What is behind Apple’s resistance to providing the federal government with modified software – a so-called backdoor – that would allow investigators to break into one of its phones?

A: Once you modify software to create a backdoor, it can be used not only by the government on this specific phone, but on other Apple devices as well. It can also be exploited by others, including the authors of malware. The moment you create a backdoor, even with good intentions, it has the potential to be exploited. The government is downplaying the risks, and its argument rests on the stipulation that the software will be developed in such a way that it works on this one iPhone only. To do that, Apple would need to digitally sign the software to make the tie-in harder to break. Yet Apple’s signing process is highly secure. The master key used to sign an Apple operating system such as iOS is a key asset for the company, one that is highly protected and rarely used. If those keys were leaked or compromised as the result of a request like this, the implications would be dramatic. Apple also rightly insists on signing only code that meets specific quality standards. Here, it would be signing code with a deliberate vulnerability that could be exploited. That sets a dangerous precedent.
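
As a rough illustration of the tie-in Michel describes (and emphatically not Apple’s actual signing pipeline), the sketch below uses the open-source Python cryptography library to sign a hypothetical firmware image with a private key and verify it against the matching public key. A device that installs only images passing such a check is bound to whoever holds the signing key, which is why that key is so closely guarded.

```python
# Minimal sign/verify sketch; the key, image, and parameters are illustrative.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Stand-in for a vendor's closely guarded master signing key.
signing_key = rsa.generate_private_key(public_exponent=65537, key_size=3072)
verify_key = signing_key.public_key()

firmware_image = b"contents of a hypothetical OS update"

# The vendor signs the image once, before distribution.
signature = signing_key.sign(
    firmware_image,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# A device accepts the image only if the signature verifies against the
# vendor's public key; any tampering raises InvalidSignature.
verify_key.verify(
    signature,
    firmware_image,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
print("signature verified: device would accept this image")
```
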
Q: The federal government has offered to allow Apple to destroy the new software immediately once the investigation is complete. But Apple has indicated that course of action isn’t enough. Why is that?

A: Even if they say they will destroy it, it is a digital artifact, a piece of software. The moment a weakness is introduced into the device, it sets a terrible precedent: it sets the stage for anyone to recreate the same thing. If it has been done once, it can be done again. And once software like this has been created, what is stopping the government from making more and more requests of this type to technology companies?

Q: It’s been reported that Apple has cooperated with law enforcement on numerous investigations before and helped them break into suspects’ phones. What is it about this case that has become a line in the sand?

A: Because Apple has changed its operating system. With earlier versions of iOS, it was much easier to recover information than it is now. Starting with iOS 7, Apple has made it very difficult, even for itself, to get into a device and recover encrypted data without creating such a backdoor. Apple is taking the privacy of its customers very seriously. In this case, the FBI’s request includes three things. First, the software would allow investigators to bypass the phone’s security measures so they could obtain the password. Second, the backdoor would remove any limit on the number of password attempts and eliminate the delay you experience after entering a wrong password. Third, the software would be tied strictly to the device being broken into. But again, it is a software attack, and software can be changed. Once you have created the opportunity, the potential for repeating it and having an open backdoor is what makes Apple so uncomfortable.
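
To make concrete what the second point would strip away, here is a small sketch of the kind of attempt counter and escalating delay a lock screen can enforce. The limits and delay values are invented for illustration and are not Apple’s; removing exactly this sort of logic is what would allow unlimited, rapid passcode guessing.

```python
# Hypothetical lock-screen throttling; limits and delays are invented, not Apple's.
import time

MAX_ATTEMPTS = 10                                       # lock out (or wipe) after 10 failures
DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]   # seconds to wait before each retry

def try_passcode(candidate: str, real_passcode: str, failures: int) -> tuple[bool, int]:
    """Check one guess while enforcing the delay schedule and attempt limit."""
    if failures >= MAX_ATTEMPTS:
        raise RuntimeError("device locked (or erased) after too many failed attempts")
    time.sleep(DELAYS[failures])        # delays grow sharply after repeated failures
    if candidate == real_passcode:
        return True, 0                  # success resets the failure counter
    return False, failures + 1

ok, failures = try_passcode("1234", "7391", failures=0)  # one wrong guess, no delay yet

# With the limit and delays in place, brute-forcing even a short passcode takes
# years or triggers a lockout; without them, it can take minutes to hours.
```
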
Q: All things Apple aside, as an expert, what can people do to protect their personal information from hackers and others trying to access their data without permission?

A: The smart devices that started appearing on the market a few years ago, like wearable electronics and smartphones, are commodity devices. They are not PCs, where you have control and can increase the security of your system. There are well-known commercial solutions for PCs that encrypt your email and the files on your hard drive. But once you move your data to a commodity device, like a smartphone or a smartwatch, you cannot tinker with it. You have surrendered the protection of your data to the manufacturer, and you have to trust them to take the proper steps to keep it secure. It’s a delegation of trust. The moment the industry feels it has no choice and is compelled to create these backdoors, you must assume that whatever is on those devices is potentially public data. These days, many people replicate their data on multiple devices and storage services such as the cloud. Some of those places may be secure and some may not be. It is advisable to keep track of where each piece of sensitive data is held and replicated, and what protection it enjoys in each case.
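
As one concrete example of the PC-side protection Michel mentions, the sketch below encrypts data with a symmetric key before it is copied to another device or to the cloud. It uses the open-source Python cryptography package rather than any particular commercial product, and the file name is only a placeholder; the point is that the data is only as safe as the place the key is kept.

```python
# Illustrative file encryption using the open-source "cryptography" package.
from cryptography.fernet import Fernet

# Whoever holds this key can decrypt the data; keep it somewhere you control.
key = Fernet.generate_key()
fernet = Fernet(key)

secret = b"hypothetical sensitive notes"       # stand-in for a file's contents
ciphertext = fernet.encrypt(secret)

# The encrypted copy is what gets replicated to another device or the cloud.
with open("notes.txt.enc", "wb") as f:         # placeholder file name
    f.write(ciphertext)

# Reading the data back requires the same key.
assert Fernet(key).decrypt(ciphertext) == secret
```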

Published: February 26, 2016