On Apple, the FBI, encryption, and why you should be worried

Image Credit: Matthew Keys

(By Matthew Keys) – On Tuesday, a federal magistrate judge in the Central District of California ordered Apple to create software that would help federal law enforcement agents crack the security mechanisms on a phone allegedly used by one of the suspects in last December’s mass shooting in San Bernardino.

Apple, in response, released an eloquently penned letter on Wednesday in which chief executive Tim Cook warned of the real privacy and security implications for the company’s customers if it complied with the order, and encouraged a public discussion on the matter.

It was a call for dialogue Apple didn’t need to encourage: For years, a discussion has been brewing between privacy advocates and law enforcement over encryption, the practice of scrambling data, files and other information so that only authorized people or computers can read them. The debate came to a head this week when a highly visible company, Apple, was ordered to assist in a highly relatable law enforcement case: the San Bernardino terrorism investigation.

In its letter to customers, Apple’s Tim Cook says that while the company abhors terrorists and terrorism, complying with law enforcement’s order to circumvent its security mechanisms on a phone that one of the San Bernardino shooters might have used would hand police a tool that could be exploited in nearly any kind of criminal, or even civil, investigation. Despite the very real, potentially detrimental effect Apple’s position might have on one aspect of the FBI’s terrorism investigation, many in tech and privacy circles have sided with Apple on this matter. And to understand why, it’s important to explain how we got to this point.

In 2013, a former government contractor named Edward Snowden leaked thousands of documents to a number of journalists and publications detailing secret, and in some cases illegal, domestic and foreign spying programs operated by the U.S. federal government’s National Security Agency. Among other things, the documents revealed that NSA agents had managed to intercept unsecured online traffic, steal e-mails and text messages, eavesdrop on Skype and Yahoo Messenger calls and compromise the integrity of phones, tablets and computers from almost every major manufacturer running just about every kind of operating system in widespread use.

https://youtu.be/ZroffVT48-I

Although the government initially defended those programs as being limited to homeland security investigations, usually involving terrorism, the documents also revealed that agents at the NSA and its British counterpart, GCHQ, had at times used their hacking techniques to eavesdrop and spy on people not associated with terrorism or even suspected of criminal wrongdoing. World leaders, former co-workers and even love interests were just some of the victims of the NSA and GCHQ’s warrantless and illegal dragnet surveillance.

Having exploitable devices and services not only puts millions of innocent customers at risk, it’s also bad for business. Noting this, many tech companies, among them Apple, Google, Facebook, Twitter, Microsoft and Dropbox, moved to secure their hardware, software and services by adding extra layers of encryption that are difficult or impossible to penetrate.

Encrypting files and information works a lot like cable TV did back in the day: In most cases, you could plug a cable into the wall and get all the channels you wanted without any extra hardware. But certain channels, like HBO, Showtime and the Playboy Channel (this was before most people had computers in their homes), were scrambled, and if you wanted to watch them, you needed authorization from your cable company. That usually meant paying a little more money and renting a cable box with a special computer card that could decode whatever extra channels you paid for.

Modern computers, smartphones, tablets and the Internet work a lot like those cable boxes did: Instead of extra hardware, they use software and techniques that keep data and information safe from people who aren’t authorized to see it. When you pay a bill online using a credit card, encryption keeps that card number from being intercepted by someone hijacking your Wi-Fi or tapping into your Internet connection. When you snap an erotic selfie on your passcode-protected iPhone, encryption means that photo of your junk won’t be seen by another person if your phone gets stolen.
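
To make that “authorization” idea concrete, here is a minimal sketch of symmetric encryption in Python, using the third-party cryptography package (the library choice and the sample card number are illustrative, not anything from a real payment system): data locked with a key reads as gibberish to anyone who doesn’t hold that key.

    # pip install cryptography
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()   # a random 32-byte secret; holding it is "authorization"
    cipher = Fernet(key)

    # Scramble a sensitive value, say a credit card number in transit.
    token = cipher.encrypt(b"4111 1111 1111 1111")
    print(token)                  # unintelligible bytes to anyone sniffing the connection

    # Only someone holding the key can reverse the process.
    print(cipher.decrypt(token))  # b'4111 1111 1111 1111'

    # A thief with the ciphertext but a different key gets nothing:
    # Fernet(Fernet.generate_key()).decrypt(token) raises InvalidToken.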

Following the Snowden leaks, the two dominant phone-platform makers, Apple and Google, announced a series of unprecedented steps to fully encrypt all of the data on their respective phones (iPhone and Android) with the release of forthcoming operating systems. At the time, Apple said its method of scrambling user data would be so secure that it would prevent even Apple’s own employees from accessing user data without a person’s authorization (in other words, without knowing the password or passcode), even if the company was served with a lawful order, such as a warrant, to hand that information over to police.
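
How can a company honestly claim that even its own employees can’t unscramble a customer’s data? Because the decryption key is derived from the user’s passcode (and, on an actual iPhone, entangled with a unique hardware key that never leaves the device) rather than stored anywhere the company can reach. The sketch below, again in Python with the cryptography package, shows the general principle with a standard key-derivation function; it is an illustration of the idea, not Apple’s actual design:

    import base64, os
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

    salt = os.urandom(16)  # stored on the device; useless without the passcode

    def key_from_passcode(passcode: bytes) -> bytes:
        # Stretch the passcode into a 32-byte key. A real phone also mixes in
        # a hardware-bound key, so the derivation can't even be attempted
        # anywhere but on the device itself.
        kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                         salt=salt, iterations=1_000_000)
        return base64.urlsafe_b64encode(kdf.derive(passcode))

    ciphertext = Fernet(key_from_passcode(b"1234")).encrypt(b"user data")

    # The vendor never sees the passcode, so it never has the key: handing
    # the ciphertext (or the salt) to police decrypts nothing. Only
    # re-deriving the key from the correct passcode unlocks the data.
    assert Fernet(key_from_passcode(b"1234")).decrypt(ciphertext) == b"user data"

This design also explains why the passcode becomes the weak point: a four-digit code has only 10,000 possibilities, which is why iOS slows down and limits repeated wrong guesses, and why investigators want help getting around those guess-limiting protections rather than attacking the encryption itself.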

Naturally, this news made many in the law enforcement and government spaces unhappy. Apple was criticized for potentially thwarting terrorism investigations, making it difficult for police to prevent acts of terror or to investigate incidents well after they had taken place. Some in the law enforcement community even claimed Apple would be an unwitting accomplice to terrorism, and suggested there might be legal ways of forcing the tech giant to comply with lawful orders for otherwise-inaccessible information.

One way to do this would be for Apple and other companies to offer law enforcement a master key to phones and other devices. That approach is commonly known as a “back door,” and on Tuesday a federal magistrate ordered Apple to come up with such a method for the FBI’s investigation into the San Bernardino shooting.
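
A back door in this sense usually means some form of key escrow: each device’s data key would also be wrapped under a master key that the vendor or the government holds. A toy sketch (illustrative only, not any real proposal’s design) shows why that single key becomes a catastrophic point of failure:

    from cryptography.fernet import Fernet

    master_key = Fernet.generate_key()   # the escrowed "master key"

    def provision_phone() -> tuple[bytes, bytes]:
        # Each phone gets its own data key...
        data_key = Fernet.generate_key()
        # ...but a copy is wrapped (escrowed) under the master key.
        escrowed_copy = Fernet(master_key).encrypt(data_key)
        return data_key, escrowed_copy

    data_key, escrowed_copy = provision_phone()
    secrets = Fernet(data_key).encrypt(b"photos, messages, location history")

    # Whoever obtains the one master key (a court order today, a leak or a
    # breach tomorrow) can unwrap ANY phone's data key and read everything:
    recovered = Fernet(master_key).decrypt(escrowed_copy)
    print(Fernet(recovered).decrypt(secrets))

Every phone provisioned this way is only as secure as that one master key, which is the heart of the objection that follows.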

Apple and other companies have long decried giving police back door access to their hardware and services, saying such access could be exploited and abused in the same way as the NSA’s warrantless surveillance programs. What’s to stop police from expanding the use of a back door into other investigations, or, worse yet, allowing such an exploit to fall into the hands of malicious hackers?

Some have said that while Apple’s customer letter makes a point, it’s also full of “FUD,” or “fear, uncertainty and doubt.” But when it comes to Apple’s concern that a tool created for law enforcement might be abused and exploited, there’s no uncertainty or doubt about it: it’s happened before.

More than a decade ago, a military-grade cellphone surveillance device known as a “StingRay” left the battlefield and wound up in the hands of federal law enforcement agents. A StingRay mimics a cell tower and intercepts the signals of all cellphones within a given area, usually a city block, giving FBI agents access to a vast trove of data, like the call logs and location information of those handsets.

For years, FBI agents defended their use of StingRays by saying the hardware provided crucial information in homeland security-related cases, which many understood to be terrorism investigations (because the FBI often says they are). But over the last few years, the devices have been leased out to local and state police agencies under various homeland security provisions, paid for with homeland security grants, for use in domestic homeland security cases.

Except that, as one police officer told me two years ago, the devices aren’t limited to terrorism cases. In fact, the officer said, those devices are used by law enforcement agencies “in criminal investigations with no restrictions on the type of crime.”

https://twitter.com/samfbiddle/status/699970011927023616?ref_src=twsrc%5Etfw

In other words, a device that was meant to be limited to battlefield use wound up being used by FBI agents in terrorism cases, then obtained by police for domestic homeland security cases, then expanded so that police could use it whenever they wanted. Even that might not have been a problem, except that police often tried to cover up their use of StingRays, either by not getting a warrant beforehand or by deceiving judges about what exactly they were using and how they were using it. In some of those cases, criminal convictions were overturned after judges found that police didn’t follow the proper legal procedure before using the devices or concealed their use from defense attorneys, meaning criminals who otherwise would have been jailed wound up walking free.

And that’s exactly the kind of thing Apple is warning about now: Sure, the FBI says it needs the software right now to break the encryption on a terrorism suspect’s iPhone. But any method Apple comes up with won’t be limited to that one phone; it would weaken the security measures protecting every phone used by every customer.

Apple argues that this is problematic because it could open the door for the method to be misused and abused. What’s to stop judges from deciding that the back door method can be used in other cases? What’s to stop the FBI from giving that same access to local police, the same way the feds opened the door for police to use StingRays?

Many have argued that the government is actually counting on this, that the court order is meant to set a precedent opening the door to its use in other investigations down the line. The government, so far, seems to have rejected this narrative. But you can hardly blame those who are skeptical of, or downright distrusting of, anything coming out of the federal government these days, given its not-so-stellar record of upholding the privacy rights of citizens and its staunch opposition to tech companies that move to secure their customers’ data and information.

Like many cases brought by the government, this isn’t a fight Apple wanted to find itself in. But it’s an important one to have, and it’s one the company isn’t backing away from. The outcome will determine whether phone, computer and Internet users have a legitimate and all-encompassing right to protect their data and information, or whether the government can arbitrarily decide for itself what rights to privacy its citizens have, when it can violate those rights, and whether or not it needs a warrant or other judicial order to do so.

This story originally appeared on Medium.
