
The FBI’s Apple hack would be a big, shiny gift to criminals

Image Credit: Sergey Nivens/Shutterstock

The FBI wants to compel Apple to write a new, modified operating system, sign it with the company's central security key, and install it on a cell phone used by Syed Farook, one of the terrorists who carried out the San Bernardino attack. The request seems simple and harmless, even morally sound, at least on the surface. And since Apple writes updates to its iOS software all the time, the argument that creating the OS needed to unlock this phone would be a huge burden for the company does not stand up to rational inspection.

What Apple can't do, and what it shouldn't do, is reassemble and use the secret signing key that allows it to load its OS onto phones. Without that step, unlocking this phone is impossible. Here are the risks, to Apple and to the rest of us, if it happens.
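Apple hasn't published the details of its signing process, but the general mechanism, checking each OS image against a vendor signature before the device will accept it, is standard practice. Below is a minimal sketch in Python using the cryptography library; Ed25519 and all the names are illustrative assumptions, not Apple's actual scheme:

```python
# Minimal sketch of signed-firmware verification: only the holder of the
# private signing key can produce an OS image the device will accept.
# Ed25519 and these names are illustrative, not Apple's actual scheme.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# Vendor side: the closely guarded private key signs each OS build.
signing_key = Ed25519PrivateKey.generate()
os_image = b"...compiled OS build..."
signature = signing_key.sign(os_image)

# Device side: the phone carries only the public key.
public_key = signing_key.public_key()

def boot_allows(image: bytes, sig: bytes) -> bool:
    """Accept an OS image only if it carries a valid vendor signature."""
    try:
        public_key.verify(sig, image)
        return True
    except InvalidSignature:
        return False

assert boot_allows(os_image, signature)               # genuine build loads
assert not boot_allows(b"tampered image", signature)  # anything else is refused
```

This is why the FBI needs Apple: an OS built by anyone else simply won't load.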

[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":1884537,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,security,","session":"C"}']

Erosion of trust and value

First, while I'm not privy to the exact way Apple manages its OS secret key, I believe it's probably one of the most closely guarded secrets in the corporate world. It's likely that no single person manages the key and that the codes and permissions of many people are required to open it. Second, those individuals probably have neither visibility into nor access to the fully assembled key, and aren't allowed to know where assembly occurs. These are the kinds of security controls that would have to be in place for a $500 billion company to maintain the trust of its customers.
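To make that idea concrete: a standard way to ensure no single person can use a key is to split it with a threshold scheme such as Shamir secret sharing, so a quorum of custodians can reconstruct it but fewer cannot. The sketch below is a textbook illustration in Python, assumed here for explanation only, not a description of Apple's actual process:

```python
# Illustrative split-knowledge key custody via Shamir secret sharing:
# any k of n shares recover the key; fewer than k reveal nothing.
import secrets

P = 2**127 - 1  # a prime field large enough for a demo secret

def make_shares(secret: int, k: int, n: int):
    """Split `secret` into n shares; any k of them recover it."""
    coeffs = [secret] + [secrets.randbelow(P) for _ in range(k - 1)]
    def f(x):  # evaluate the random degree-(k-1) polynomial at x
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recombines the secret."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num = den = 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total

key = secrets.randbelow(P)
shares = make_shares(key, k=3, n=5)  # five custodians, any three suffice
assert recover(shares[:3]) == key    # a quorum reconstructs the key
assert recover(shares[1:4]) == key   # any quorum works
```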

And that's what this is about: trust. Apple's value can be defined almost exclusively as trust. I trust Apple with my personal data, my pictures, my conversations, my work data, my credit card data, and so on. Artists entrust their creative content (songs, movies, books, etc.) to Apple, believing they will be compensated and their content will remain protected. Merchants trust the security of Apple Pay. If Apple ends up building this OS, and assembling the signing keys for this one phone, that trust goes away. Apple could lose billions of dollars in valuation because that trust was breached. Like a reputation, trust takes a company years to earn but can be destroyed in seconds.


A slippery slope and criminal opportunity

Don't be fooled by the false argument that this is a one-time request and that the secret keys and OS wouldn't be leaked. Once the break-in tool exists, why should Apple refuse to unlock the phone of an ordinary criminal? Why not unlock the phone of a drug dealer to find his customers? Why not unlock a phone found near a crime scene? Why not unlock your phone to see if you are breaking the law? Furthermore, if this OS were built, criminals throughout the world would pay handsomely to get their hands on it. Are all the technicians with the ability to build it trustworthy? What if the coercion tactic were a death threat against their families?

Questionable effectiveness or benefit

Finally, I think we have to ask ourselves: What would the FBI's requested efforts actually accomplish? The San Bernardino attackers had phones, computers, and other devices that they used before, during, and shortly after the crimes, and that they completely physically destroyed. Do we really think Mr. Farook planned his plots on his work phone and then forgot to destroy it along with his computer? I think we all know that this is unlikely.

Like others in my industry, I have a personal stake in the outcome of Apple's case. For most of my career, I have provided communication networks to businesses. These businesses, my clients, expect their communications to be private and their data to be secure. And while in certain countries we have an obligation to provide lawful intercept of public telephone calls under a warrant, the technology to encrypt calls from source to destination is already available, with no ability for a technology company in the middle to intercept them, whether compelled by a court or not.
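To show just how little stands between anyone and such tools, here is a complete end-to-end encrypted exchange in a few lines of Python using the PyNaCl library, offered as a sketch of the principle: the carrier relaying the message never holds a key that can open it.

```python
# Minimal end-to-end encryption sketch with PyNaCl (libsodium bindings):
# the relay in the middle sees only ciphertext it cannot decrypt.
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Each side needs only the other's *public* key;
# private keys never leave the endpoints.
alice_box = Box(alice_key, bob_key.public_key)
bob_box = Box(bob_key, alice_key.public_key)

ciphertext = alice_box.encrypt(b"meet at noon")   # all the network ever sees
assert bob_box.decrypt(ciphertext) == b"meet at noon"
```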

These end-to-end encryption tools are simple and widely available. Criminals can write their own end-to-end encrypted apps at any time, and then not even Apple, nor all the king's horses and men, can intercept them. So again, the U.S. government wants Apple to put itself at grave fiduciary risk for no gain to the public at large. And criminals get an early start on their big, shiny gift of unbreakable communications.

We all need to think long and hard about what we will fight to keep private, because the government is coming with its emotional arguments, and it looks like it will leave no stone unturned.

Curtis Peterson is senior vice president of operations at RingCentral.

[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":1884537,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,security,","session":"C"}']
