Google and Facebook want as much of your data as they can get their hands on, because their survival depends on using information about you to better target you with advertisements.
Apple claims to be different – it sells hardware, like Macs, iPhones and Watches, not ads.
As Apple CEO Tim Cook said at the Apple Town Hall event in March, in reference to the company’s stand-off with the FBI over encryption backdoors, Apple products like the iPhone are “deeply personal,” and Apple has a “responsibility to protect your data and your privacy.”
And yet, Apple faces a dilemma, because it wants to catch up to the likes of Google and Microsoft in using artificial intelligence and big data to make its products more responsive to users.
At its Worldwide Developer Conference (WWDC) this week, Apple unveiled what it hopes is the solution, called “differential privacy.”
Instead of sending your data to Apple to create a personal profile of you with your information, Apple says the new versions of its operating systems – iOS 10 and the replacement for OS X, called macOS – will use on-device intelligence and “crowdsourced learning.”
This means iPhones running iOS 10 can personalize your apps – like identify the people and objects in Photos, or serve you more relevant information in Maps and News – without sucking your data up to Apple’s servers.
Starting with iOS 10, Apple is using technology called Differential Privacy to help discover the usage patterns of a large number of users without compromising individual privacy. In iOS 10, this technology will help improve QuickType and emoji suggestions, Spotlight deep link suggestions and Lookup Hints in Notes.
On Macs, Apple will suck up your data, but differential privacy means no one – not Apple, not intelligence services, not attackers – can identify you individually:
Starting with macOS Sierra, Apple is using technology called Differential Privacy to help discover the usage patterns of a large number of users without compromising individual privacy. In macOS Sierra, this technology will help improve autocorrect suggestions and Lookup Hints in Notes.
At WWDC, Apple’s Craig Federighi said Apple can offer “great features and great privacy” through differential privacy.
Differential privacy is not a single technology but rather a statistical approach to analyzing data in a way that protects individual privacy.
In its implementation, Apple will obscure data with multiple techniques, including hashing (turning data into unreadable characters), subsampling (using data from only a portion of users) and noise injection (adding random data to obscure real data).
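To give a sense of how noise injection can protect individuals while preserving aggregate patterns, here is a minimal sketch of "randomized response," a classic differential-privacy mechanism. This is an illustration of the general technique, not Apple's actual implementation, and all the function names and parameters are invented for the example:

```python
import random

def randomized_response(true_value: bool, p: float = 0.75) -> bool:
    """Report the true value with probability p; otherwise report a coin flip.

    Any single report is plausibly deniable -- an observer cannot tell
    whether it reflects the user's real data or random noise.
    """
    if random.random() < p:
        return true_value
    return random.random() < 0.5

def estimate_true_rate(reports, p: float = 0.75) -> float:
    """Invert the known noise to recover the population-level rate.

    E[reported rate] = p * true_rate + (1 - p) * 0.5,
    so true_rate = (reported rate - (1 - p) * 0.5) / p.
    """
    reported = sum(reports) / len(reports)
    return (reported - (1 - p) * 0.5) / p

# Simulate 100,000 users, 30% of whom actually have the trait being measured.
random.seed(42)
true_rate = 0.30
reports = [randomized_response(random.random() < true_rate)
           for _ in range(100_000)]
print(round(estimate_true_rate(reports), 2))  # close to 0.30
```

The point is that the analyst learns the usage pattern of the crowd (roughly 30% here) without ever receiving a trustworthy answer from any individual user.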
Apple gave one of the most influential researchers in the field of differential privacy, Aaron Roth, a chance to review some of the math involved in its implementation, quoting Roth at WWDC as saying Apple is a “clear privacy leader among technology companies today.”
But not everyone is fully convinced that Apple can pull off the promise of differential privacy, at least not right away.
Matthew Green, a security technologist and professor of cryptography at Johns Hopkins University, said on Twitter that Apple is attempting to take differential privacy from theory to widespread implementation without much “practice” in between.