Despite constant news of hacks and other security breaches, few iOS device owners express much worry about the safety of their devices, even in Silicon Valley. While it's true that the vast majority of malicious mobile hacks happen on Android, iOS apps are not as safe as their users like to believe. No doubt, Apple has done an impressive job keeping the App Store secure. However, reflecting on my last couple of decades in Internet security, I see some worrying trends: vulnerabilities like the ones that plagued PCs and the web in the '90s and 2000s are now appearing on mobile, and my concern is that this new generation of mobile developers doesn't even remember them.
Hacks, cracks, and server access — from the ’90s to now
Hackers have been around since before websites even existed. As a budding white hat security specialist in my teen years, I'd often track hackers' attacks on a server's daemon (its background service program) as they attempted to gain access. Back then, the daemon ran as a single binary file (i.e., the executable program) on the server. By reverse engineering this binary and finding edge cases its developers had missed, hackers would create exploits and transmit them remotely to gain command access. The important lesson is that developers can't anticipate and close off every edge case in their products; there will inevitably be holes.
[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":1693960,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"mobile,security,","session":"B"}']This scenario hasn’t actually changed all that much in the mobile era. Hackers still search through binary files to create exploits to gain server access and admin privileges. And all too often, I see iOS developers repeating mistakes we made during the PC and web days, forgetting (or not knowing) to add security measures around every layer of the mobile ecosystem. These developers are under the illusion that the iOS ecosystem is completely safe, or that they don’t need to actively take measures to protect themselves. This is not the case.
For another parallel with PC hacking history, consider the rampant cracking of coveted programs like Photoshop. Few people paid for Photoshop when they could download it for free and unlock it with a generated license key. (That's probably why Adobe finally moved Photoshop and its other programs to a monthly online subscription.) We're seeing a similar phenomenon now with iOS apps: a quick search of hacker forums will turn up hundreds, if not thousands, of cracked iOS apps, with download numbers sometimes comparable to those of legitimate downloads from the actual App Store.
Which brings me to my next point:
Human error and the App Store review process
Copycat iOS apps show up not just on pirate sites but in the official App Store, and far more frequently than most people assume. This isn't directly Apple's fault; it's a consequence of Apple's human-centered review process. Unless a reviewer has infinite time to research every app ever submitted and published, it's simply impossible to catch and filter out every copycat, especially clones of lesser-known (but relatively well-performing) apps. App Store reviewers must get through thousands of apps per week, and it's time-consuming enough merely to check whether a given app runs as described, doesn't crash, and doesn't violate Apple's basic guidelines. Inevitably, the process misses a lot of hacked or cracked apps: a serious security liability for users, and a grave economic blow to honest app developers.
Speaking of developers, they themselves often put iOS security at risk through human error:
Reverse engineering, malware, and jailbroken iOS devices
App developers typically have little experience with reverse engineering and, more importantly, are unaware of its consequences. iOS app binaries are encrypted by default, but that encryption can easily be stripped with Clutch and other publicly available programs. From there, a strong reverse engineering tool such as IDA enables anyone with enough skill to inspect the contents of the files inside a given app. Add the numerous online tutorials that show people how to use these tools, and you have an army of people who understand enough to hack mobile applications on their own.
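To make the stakes concrete, here's a minimal Swift sketch of one of the cheapest countermeasures: runtime string assembly. The `apiKey` value and the `deobfuscate` helper are hypothetical names of my own, and XOR obfuscation only raises the bar slightly. The point is to show why a secret hard-coded in a decrypted binary is readable by anyone, and what a first, modest layer of obfuscation looks like.

```swift
import Foundation

// A secret hard-coded like this survives intact in the app binary; once the
// encryption is stripped, `strings` or a disassembler recovers it instantly.
// (The key itself is a made-up placeholder.)
let apiKey = "sk_live_PLACEHOLDER"

// Cheapest mitigation: store the secret XOR-ed against a byte and rebuild it
// only at runtime, so the plaintext never sits in the binary's data section.
// This raises the bar slightly; it won't stop a determined reverse engineer.
func deobfuscate(_ bytes: [UInt8], key: UInt8) -> String {
    String(bytes: bytes.map { $0 ^ key }, encoding: .utf8) ?? ""
}

// In a real build these bytes would be precomputed offline; they're derived
// inline here only to keep the sketch self-contained and verifiable.
let obfuscated: [UInt8] = apiKey.utf8.map { $0 ^ 0x5A }
assert(deobfuscate(obfuscated, key: 0x5A) == apiKey)
```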
Reverse engineering is bad enough on its own, but jailbroken iOS devices make it an even greater threat: it's relatively easy to add malware to an iOS app running on a jailbroken device. Such malware can hook functions like SSLWrite to intercept users' Apple IDs and passwords. It's also fairly easy to tamper with the communication between an app and its server to launch a man-in-the-middle (MITM) attack, which intercepts packets of information (such as passwords) and alters them before they're transmitted to the server.
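One common defense against this kind of interception is certificate pinning: the app refuses to talk to any server whose TLS certificate doesn't match the one it shipped with. Below is a minimal sketch using modern Apple APIs (URLSession plus CryptoKit, so iOS 13 and later); the pinned hash is a placeholder, and a production implementation would typically pin a public-key hash and keep a backup pin.

```swift
import Foundation
import Security
import CryptoKit

// Minimal certificate-pinning sketch: reject any TLS session whose leaf
// certificate doesn't hash to the value baked into the app.
final class PinningDelegate: NSObject, URLSessionDelegate {
    // Placeholder, not a real digest.
    private let pinnedCertHash = "PLACEHOLDER_BASE64_SHA256_DIGEST="

    func urlSession(_ session: URLSession,
                    didReceive challenge: URLAuthenticationChallenge,
                    completionHandler: @escaping (URLSession.AuthChallengeDisposition, URLCredential?) -> Void) {
        guard challenge.protectionSpace.authenticationMethod == NSURLAuthenticationMethodServerTrust,
              let trust = challenge.protectionSpace.serverTrust,
              let cert = SecTrustGetCertificateAtIndex(trust, 0) else {
            completionHandler(.cancelAuthenticationChallenge, nil)
            return
        }
        let certData = SecCertificateCopyData(cert) as Data
        let hash = Data(SHA256.hash(data: certData)).base64EncodedString()
        if hash == pinnedCertHash {
            completionHandler(.useCredential, URLCredential(trust: trust))
        } else {
            // The presented certificate isn't the one we shipped with:
            // assume an interception proxy and drop the connection.
            completionHandler(.cancelAuthenticationChallenge, nil)
        }
    }
}

// Route the app's traffic through a session that enforces the pin.
let pinnedSession = URLSession(configuration: .default,
                               delegate: PinningDelegate(),
                               delegateQueue: nil)
```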
While jailbroken devices are a serious security risk, they're also popular among users, so app developers (and Apple) are reluctant to take security measures that prevent apps from running on them. Blocking jailbroken devices means shutting out a large group of early adopters who would otherwise use and review your app, and reviews and ratings are critical to an app's initial growth. Apple's closed platform doesn't allow developers to use a wide range of third-party security tools, but at the very least, companies should implement source code encryption/obfuscation and consider blocking jailbroken devices, along the lines of the sketch below.
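For teams that decide to block (or merely flag) jailbroken devices, detection usually combines several heuristics. The following Swift sketch is one plausible implementation; the path list and the `deviceLooksJailbroken` name are my own, every check can be bypassed by a determined attacker, and on iOS 9 and later the URL-scheme check also requires declaring the scheme in the app's Info.plist.

```swift
import UIKit

// Heuristic jailbreak checks: none is conclusive on its own, but together
// they catch most stock jailbreak setups.
func deviceLooksJailbroken() -> Bool {
    // 1. Artifacts that common jailbreaks leave on the filesystem.
    let suspiciousPaths = [
        "/Applications/Cydia.app",
        "/Library/MobileSubstrate/MobileSubstrate.dylib",
        "/bin/bash",
        "/usr/sbin/sshd",
        "/etc/apt"
    ]
    for path in suspiciousPaths where FileManager.default.fileExists(atPath: path) {
        return true
    }

    // 2. A sandboxed app shouldn't be able to write outside its container.
    let probePath = "/private/jailbreak_probe.txt"
    do {
        try "probe".write(toFile: probePath, atomically: true, encoding: .utf8)
        try FileManager.default.removeItem(atPath: probePath)
        return true  // Write succeeded: the sandbox is broken.
    } catch {
        // Expected on a stock device; fall through to the next check.
    }

    // 3. Being able to open Cydia's URL scheme is another strong hint.
    if let url = URL(string: "cydia://package/com.example.package"),
       UIApplication.shared.canOpenURL(url) {
        return true
    }

    return false
}
```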
[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":1693960,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"mobile,security,","session":"B"}']
But that’s only a start.
Securing Apple’s future — and ours
Apple should review, test, and approve security solutions beyond the ones it has built itself, allowing developers to add extra layers of protection. While this would require Apple to spend resources coordinating with third-party security vendors, it would give developers more options and stronger security, and their end users more safety. I also strongly hope the company adds automated testing for suspicious behavior and questionable permission requests to its review process. To Apple's credit, it does encrypt apps by default; however, no single security measure can cover all the edge cases or flaws an app may contain. Making the platform flexible enough to accommodate third-party security solutions would make iOS apps much more difficult to crack, leading to a significant reduction in hacking.
As I write this, Apple is about to introduce its line of smartwatches, while rumors swirl that the company will enter the self-driving car business. Whatever Apple's plans, third-party apps will likely be a critical part of them, giving the company all the more reason to address these concerns now. I have great admiration for Tim Cook and his team and am confident they will. The company's future, and to a great extent ours, depends on it.
Minpyo Hong has advised corporations, NGOs, and governments on digital security issues for over 20 years, and led a team of five-time finalists at Defcon. Hong is currently founder and CEO of Seworks, a San Francisco-based developer of advanced security solutions for the mobile era.
[aditude-amp id="medium2" targeting='{"env":"staging","page_type":"article","post_id":1693960,"post_type":"guest","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"mobile,security,","session":"B"}']