Naked Security

Apple’s App Store hit by the XcodeGhost of malware present

Until this week, the App Store was to malware what Earth was to the Hitchhiker's Guide: "Mostly Harmless." Not any more...

You’ve probably read all sorts of to-and-fro about Apple’s App Store this week.

Until now, the App Store has been to the malware scene what the planet Earth was to Douglas Adams’s HHGttG: Mostly harmless.

That changed a few days ago, when Palo Alto Networks published a series of articles about malware that had shown up in the App Store.

This malware not only sailed past Apple’s security vetting process, but also originated from software vendors you wouldn’t expect to be involved in malware creation and distribution.

When Palo Alto began to unravel the how, to go with the what and the when, things quickly got interesting.

Fascinating, in fact.

What happened?

In a nutshell, here’s what seems to have happened:

Chinese cybercriminals produced a “cooked” remix of Apple’s Xcode development toolkit, a multi-gigabyte download that you usually get from the App Store. Xcode is free, so a pirated version sounds pointless, but the theory seems to be that the cooked version was available locally from Chinese servers and was therefore promoted as faster and easier to download.

The cooked version was, in fact, downright crooked, because the hackers mixed in some “secret sauce” with their locally-sourced download.

• The secret sauce was very different from the usual sort of malware we find stuffed into pirated apps. Instead of directly affecting the developer’s computer, the booby-trapped Xcode, dubbed XcodeGhost, indirectly infected iOS apps when they were compiled.

→ You can see where this is going: developers with sloppy security practices, such as using illegally-acquired software of unvetted origin for production builds, turned into iOS malware generation factories for the crooks behind XcodeGhost.

The resulting infected iOS apps had malware buried in parts that looked like Apple-supplied components. Apple let many of these apps through App Store validation, presumably because the parts compiled from the vendor’s own source code were fine.

Numerous booby-trapped apps made it into the App Store before the alarm was raised.

What happened next?

Apple’s response, as reported by the BBC, was understandable:

We've removed the apps from the App Store that we know have been created with this counterfeit software. We are working with the developers to make sure they're using the proper version of Xcode to rebuild their apps. [Christine Monaghan of Apple.]

Quite right, too.

Back when compilers and development tools cost thousands, or tens of thousands of dollars, you could understand, albeit not condone, the use of pirated versions of trusted development tools.

But modified pirated versions of a download that’s free anyway?

If you’re going to get an unofficial copy of a free download that you then use to build software you expect other people to trust, at least go to the trouble of verifying your download with someone who downloaded the real thing!
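That sort of verification can be as simple as comparing cryptographic hashes. Here is a minimal sketch (illustrative only, not an official Apple process): hash your copy and compare it against a digest obtained out-of-band from someone who fetched the genuine download.

```python
# Minimal download-verification sketch: hash your local copy and compare
# it against a digest obtained independently from a trusted source.
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MB chunks, so a multi-gigabyte download
    such as Xcode never has to fit in memory all at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_download(path, known_good_hexdigest):
    """True only if the file matches the independently obtained digest."""
    return sha256_of(path) == known_good_hexdigest
```

A mismatch doesn't tell you *which* copy is bad, but it does tell you that the two copies aren't the same file, which is exactly the warning a developer using a "cooked" Xcode would have needed.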

Of course, you can also argue that Apple’s App Store verification process shouldn’t have been so easy to trick.

We’re guessing here, but – as we suggested above – it’s reasonable to assume that the malware evaded Apple’s checks at least in part because of where it ended up in the booby-trapped apps.

In a way, it’s a bit like the difference between spotting a hacked ATM that has a fake PIN pad glued onto it by an “outside” crook, and spotting a hacked point-of-sale terminal that was modified during manufacture by an “inside” supply-chain crook.

Let’s hope Apple upgrades its App Store validation process to assume that all parts of an externally-built application may be malicious, not merely the parts where you might most expect malware to be found.

After all, the App Store vetting process isn’t an on-access virus scan that needs to complete within a few tenths of a second so as not to be invasive.

Similarly, the vetting process isn’t like a real-time web gateway filter that needs to avoid holding up downloads to the point that users try to bypass the control.

What to do?

We suggest the following:

  • Apple should identify exactly which apps (and which version numbers) have been or will be removed from the App Store. That will help users of affected software to ensure they now have a safe version.
  • Software vendors shouldn’t use ripped-off development tools. C’mon, guys!
  • Software vendors who have been using infected development environments should own up and list any of their app versions that were infected.
  • Apple should assume a completely hostile build environment for all parts of all apps submitted to the App Store.

→ Even a pristine installation of Xcode could be tricked into creating malware if it were run on an infected computer. That’s how the Stuxnet malware worked. It didn’t infect the SCADA development software. Instead, it infected the process control software module after it was built, during installation to the target device.

Where next?

The problem of untrustworthy software development environments is an old chestnut in computer security.

Ken Thompson, one of the most famous Unix gurus of all time, dealt with this problem in his acceptance speech for the 1983 Turing Award.

He called the lecture Reflections on Trusting Trust, and described a booby-trapped compiler that would add a backdoor to the login program when you rebuilt your system.

In other words, if you had security concerns about your system, you could rebuild the whole thing from trusted source code…

…but that didn’t solve the problem of how to trust the compiler needed to rebuild the compiler to rebuild your operating system from trusted source code. (You may want to read that sentence two or three times!)

Perhaps App Store submissions will require full source code in future?

You compile it; Apple compiles it; if and only if the two packages agree, your app goes forward for further analysis.

We imagine there would be an outcry if Apple were to enforce this – What? Share everything with Apple, even my source code?

But so-called double-compilation is a respectable approach to countering the problem of trusting trust.

It would certainly help to confirm that both third-party developers and Apple were singing from the same code generation hymn-sheet.
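As a thumbnail sketch of that if-and-only-if rule (illustrative only, and it assumes reproducible builds, so that identical source yields bit-identical packages on both sides):

```python
# Illustrative "compile twice, compare" gate. Assumes reproducible builds:
# the same source must produce bit-identical packages on both sides,
# otherwise the comparison is meaningless.
import hashlib

def package_digest(package_bytes):
    """Fingerprint a compiled package with SHA-256."""
    return hashlib.sha256(package_bytes).hexdigest()

def goes_forward(developer_build, apple_build):
    """If and only if the two independently compiled packages agree
    does the app proceed to further analysis."""
    return package_digest(developer_build) == package_digest(apple_build)
```

A developer building with a booby-trapped toolchain would produce a package that differs from Apple's clean-room build, so the submission would be rejected before any malware analysis was even needed.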

Image of ghost and pumpkin courtesy of Shutterstock.


Good article but left out critical information on the malware itself, i.e. how the malware would behave or what information it would steal if I had one of the infected apps installed. Would it be localized to that of the app (because of iOS sandboxing) or would the whole device be comprised including all other apps installed on the device. Could the device firmware then be comprised as well?


Point taken.

But here’s what you can do: give Palo Alto some link love and click through to the articles mentioned above!

Loosely speaking, the booby-trapped compiled apps include various basic spyware-like capabilities. These include popping up local alerts (handy for realistically phishing for usernames and passwords) and sniffing through the clipboard (handy for recovering recently-used PII).

As for “what would the effect be on each and every compiled app,” well, that depends on how the rest of the app is coded, what it does, and so forth. Until Apple and the afflicted vendors ‘fess up, who can say?

(Even with Palo Alto’s list, augmented by other lists and by those vendors who have done the right thing and owned up, how can you tell which paid apps that have since been retired or replaced might have been booby-trapped in the past, and how they would have behaved if used?)

One problem with a walled garden is that for those of us looking in from the outside, it can be hard to tell what’s really there. Unless you expect every anti-malware company to buy every paid app in every published version…

So, in my own defence, this article wasn’t really about the individual malware, which is one of those “we may never know exactly what happened unless Apple opens up” situations (and Apple has it as an operational security principle *not* to open up, whatever you think of that approach).

The article is more of an op-ed about how you deal with the metaproblem: defending a so-called trusted environment, or walled garden, against the sort of supply-chain-type attack that you didn’t originally prepare for.

As the headline is supposed to suggest, this is more about the issue of the “ghost in the App Store” than the malware in this particular phantasmic event :-)


Didn’t the NSA do this exact same thing last year? With the mention of enabling eavesdropping with your own mic on the iPhone.


I hear this was mainly in China and the Orient. So people who downloaded code from there because Apple was too slow or too occupied… got zonked.

Apple must not be a very big or wealthy company, if they cannot afford fast servers and internet. Unless, of course, the further away you are from California the longer a download takes.

The solution would be for Apple to open offices in the Orient (a major market, population wise)…..

