Apple today filed a motion to vacate the court order, sought by the U.S. Justice Department, that would compel it to help the FBI unlock an iPhone 5c that belonged to San Bernardino shooter Syed Rizwan Farook.
Federal prosecutors, acting on behalf of the FBI, filed a 35-page motion to compel Apple's compliance in the U.S. District Court for the Central District of California on February 19. Now, less than 24 hours after Apple CEO Tim Cook stated on television that requiring Apple to break the iPhone's encryption would be "bad for America," the company has submitted its legal response to the court order.
[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":1883680,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"bots,business,cloud,dev,mobile,security,","session":"A"}']Today’s 47-page filing hits on many of the points Cook made on TV and wrote in his letter to customers last week, including his assertion that this case affects public safety and goes beyond the idea of national security versus privacy that has largely framed the issue until now. He stated that the desired software (once created) could be used on multiple phones, not just Farook’s iPhone, and claimed that the government has “cut off debate” by beginning this battle “behind closed doors” instead of in Congress.
The filing begins with a strongly worded assertion that “the Constitution forbids” what the government is asking for. Further, the company states, complying with the government’s request would “require Apple to create full-time positions in a new ‘hacking’ department to service government requests and to develop new versions of the backdoor software every time iOS changes.”
“Although it is difficult to estimate, because it has never been done before, the decision, creation, validation, and deployment of the software likely would necessitate six to ten Apple engineers and employees dedicating a very substantial portion of their time for a minimum of two weeks, and likely as many as four weeks,” Apple says. “Members of the team would include engineers from Apple’s core operating system group, a quality assurance engineer, a product manager, and either a document writer or a tool writer.”
The effort would require quality assurance, testing, and the production of documentation, as well as procedures for securely using the technology if it is to be run outside an Apple facility. And Apple will have to record everything, naturally, “in case Apple’s methodology is ever questioned.”
As it has done before, Apple points out in today’s filing that other law enforcement agencies, in the U.S. and abroad, might well want the same technology that the U.S. government is currently seeking — so deleting such technology after it’s used in this case would not be very smart.
“Building everything up and tearing it down for each demand by law enforcement” would amount to what Apple calls here an “enormously intrusive burden.” But keeping the encryption-breaking technology in place would be just as big of a burden, Apple says. For one thing, hackers and other types will inevitably try to get at it.
And despite the patriotic statements Cook made on TV yesterday, Apple’s tone today is decidedly defiant. Nothing is going to force Apple to be “drafted into government service” to make software that meets the government’s needs, Apple states. And the company goes on to say that it is “not a ‘highly regulated public utility with a duty to serve the public,'” unlike the New York Telephone Co., a company that the U.S. government fought in court over the All Writs Act — the same 18th-century law the DOJ is using to attempt to make Apple unlock Farook’s phone.
[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":1883680,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"bots,business,cloud,dev,mobile,security,","session":"A"}']
“Nothing in federal law allows the courts, at the request of prosecutors, to coercively deputize Apple and other companies to serve as a permanent arm of the government’s forensics lab,” Apple states.
In challenging the legality of the government’s request, Apple specifically cites the First and Fifth Amendments.
“Under well-settled law, computer code is treated as speech within the meaning of the First Amendment,” Apple states in the filing, arguing that meeting the demands of the government would equate to “compelled speech.”
The company is also challenging the government on Fifth Amendment grounds, arguing in part that it has only a tenuous connection to the crime and that the order therefore violates its due process rights, including the right "to be free 'from arbitrary deprivation of [its] liberty by government.'"
[aditude-amp id="medium2" targeting='{"env":"staging","page_type":"article","post_id":1883680,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"bots,business,cloud,dev,mobile,security,","session":"A"}']
Finally, in his written testimony submitted to the court, Erik Neuenschwander, Apple's manager of user privacy, passes over the popular "FBiOS" handle and instead describes what the government wants as "GovtOS." The term underscores that it's not just the FBI applying pressure here, as the DOJ and certain lawmakers have also begun calling for Apple's compliance.
Our timeline of the case is here.
Harrison Weber contributed to this report.