Apple, it seems, just can’t win when it comes to openness.
Or lack of it.
You’ll find many articles right now that suggest that Apple somehow “forgot” to apply its usual obfuscations to the iOS kernel when it released its recent iOS 10 Beta.
This has led to the thorny question, “Why?”
Proprietary software products, even if they contain open source components, as both Apple’s and Google’s operating systems do, are often built from source code in a way that deliberately makes them much harder to reverse engineer.
Malware often does much the same thing, of course, all the way from a simple scrambling of tell-tale text such as URLs and pop-up messages, to heavy-duty rewriting of scripts so that they are as good as illegible and incomprehensible to human analysts.
(Ironically, as far as malware is concerned, going to great lengths to hide oneself is itself a useful tell-tale for machine analysis, for all that it increases the effort needed to come to a precise understanding of the scrambled code.)
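To make the "simple scrambling of tell-tale text" concrete, here's a minimal sketch in Python of the sort of trick malware commonly uses: XOR the string with a key and base64-encode it, so a plain text search of the file reveals nothing, with the original only reconstructed at run time. (The URL and key are made up for illustration; real samples vary widely.)

```python
import base64

KEY = 0x5A  # single-byte XOR key: trivially weak, but common in real malware


def scramble(text: str) -> str:
    """XOR each byte with KEY, then base64-encode the result."""
    xored = bytes(b ^ KEY for b in text.encode())
    return base64.b64encode(xored).decode()


def unscramble(blob: str) -> str:
    """Reverse the scrambling at run time, just before the string is used."""
    xored = base64.b64decode(blob)
    return bytes(b ^ KEY for b in xored).decode()


hidden = scramble("http://example.invalid/payload")
print(hidden)              # gobbledegook: a plain strings dump won't show the URL
print(unscramble(hidden))  # the original URL, recovered only in memory
```

Note the irony mentioned above: a file full of base64 blobs that get XOR-decoded at run time is itself a red flag to automated analysis, even though it slows down a human analyst.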
Web developers do this sort of thing, too, to try to protect their intellectual property.
Commercial PHP scramblers such as ionCube, for example, deliberately turn human-readable, text-based PHP files into equivalent but hard-to-figure-out gobbledegook.
This helps to protect programming details that have competitive value, such as nifty algorithms that save space or time.
Google, likewise, actively encourages Android developers to use its ProGuard tool before shipping their final versions:
[ProGuard] optimizes [your app’s] bytecode, removes unused code instructions, and obfuscates the remaining classes, fields, and methods with short names. The obfuscated code makes your [Android package] difficult to reverse engineer, which is especially valuable when your app uses security-sensitive features, such as licensing verification.
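For the curious, enabling this in an Android project is a one-line switch in the app's Gradle build file, plus an optional rules file for anything that must keep its real name (for example, classes found via reflection). The class name below is a made-up placeholder; your project's names and file layout may differ:

```groovy
// build.gradle (app module)
android {
    buildTypes {
        release {
            minifyEnabled true   // run ProGuard on release builds
            proguardFiles getDefaultProguardFile('proguard-android.txt'),
                          'proguard-rules.pro'
        }
    }
}
```

```
# proguard-rules.pro
# Keep anything looked up by name at run time, e.g. a hypothetical
# licensing class, or ProGuard's renaming will break it:
-keep class com.example.licensing.LicenseChecker { *; }
```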
So Apple’s sudden and apparent openness in its iOS 10 Beta firmware has led to an intriguing public debate:
- Is this a blunder by the product release team that somehow slipped through quality assurance unnoticed? If so, could this reduce security by making vulnerabilities easier for crooks to find?
- Is this a cunning plan by Apple to provide an unofficial way for interested parties to dig deeper into iOS of their own accord? If so, could this improve security by helping to dispel fears of “deliberate backdoors” after the FBI recovered data from an encrypted iPhone by undisclosed means?
Calling all Apple fans/detractors!
What do you think? The floor is yours…
Note. You may comment anonymously. Just leave the name and email fields blank when you post.