What is a friend, exactly? It’s a tricky question.
Too tricky even for the computing might of Facebook.
The Social Network is still some way short of total omniscience, so its Menlo Park boffins (apparently unconvinced by our ability to identify friends unaided) have to resort to rummaging through our virtual stuff looking for clues pointing to undeclared, undiscovered “People You May Know”.
Facebook describes how it does its rummaging in the following, vague terms:
People You May Know are people on Facebook that you might know. We show you people based on mutual friends, work and education information, networks you’re part of, contacts you’ve imported and many other factors.
Those “other factors” are a secret, and how they interact has captured the attention of Fusion editor Kashmir Hill.
In June, Hill published an interview with a father who attended a gathering for suicidal teens. The father was shocked to discover that, following the highly sensitive meeting, one of the participants appeared in his People You May Know feed.
The only thing the two people seemed to have in common was that they’d been to the same meeting.
According to Hill:
The two parents hadn’t exchanged contact information (one way Facebook suggests friends is to look at your phone contacts). The only connection the two appeared to have was being in the same place at the same time, and thus their smartphones being in the same room.
Facebook’s response to the claim left Hill with “reportorial whiplash”, as she called it.
Facebook first suggested that location data was used by People You May Know, but only when it wasn’t the sole thing two users had in common; then said that it wasn’t used at all; and finally admitted that it had been used in a test late last year but was never rolled out to the general public.
So Hill was mistaken in her initial claim that “Facebook is using your phone’s location to suggest new friends”, but (if we assume his story is genuine) the outcome for the father was still the same.
Whatever confluence of events led Facebook to make recommendations that exposed sensitive information via People You May Know could presumably happen to others.
One of those others is a psychiatrist called Lisa (not her real name), another Hill interviewee who found out in the summer of 2015 that Facebook was suggesting her patients to each other in People You May Know:
“It’s a massive privacy fail,” said Lisa. “I have patients with HIV, people that have attempted suicide and women in coercive and violent relationships.”
Lisa lives in a relatively small town and was alarmed that Facebook was inadvertently outing people with health and psychiatric issues to her network.
Lisa was convinced that Facebook must have been using location data to suggest her patients to each other but Hill suspects that what they have in common is Lisa’s phone number:
[Lisa] was surprised to see that she had, at some point, given Facebook her cell phone number. It’s a number that her patients could also have in their phones. Many people don’t realize that if they give Facebook access to their phone contacts, it uses that information to make friend recommendations; so if your ex-boss or your one-time Tinder date or your psychiatrist is a contact in your phone, you might start seeing them pop up in the “People You May Know” list.
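To make the mechanism Hill describes concrete, here’s a toy sketch (this is an illustration of the general idea, not Facebook’s actual algorithm, and all names and numbers in it are made up) of how two users who have never interacted could be linked simply because both imported the same phone number, such as their psychiatrist’s:

```python
from collections import defaultdict

def suggest_via_shared_contacts(uploaded_contacts):
    """uploaded_contacts maps each user to the set of phone numbers
    they imported. Any two users who both hold the same number are
    treated as possibly knowing each other."""
    holders = defaultdict(set)  # phone number -> users who imported it
    for user, numbers in uploaded_contacts.items():
        for number in numbers:
            holders[number].add(user)

    suggestions = defaultdict(set)
    for number, users in holders.items():
        for a in users:
            for b in users:
                if a != b:
                    suggestions[a].add(b)  # a and b share a contact
    return suggestions

# Two patients who each have "Lisa's" number in their phone --
# and have nothing else in common -- end up linked to each other.
contacts = {
    "patient_1": {"+1-555-0100"},
    "patient_2": {"+1-555-0100"},
    "stranger":  {"+1-555-0199"},
}
print(suggest_via_shared_contacts(contacts))
```

Even this crude version shows why the behaviour is hard for users to anticipate: the connection is created by a third party’s number sitting in two address books, something neither patient ever chose to share with the other.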
The need to maintain patient confidentiality has prevented both Hill and Facebook from investigating the underlying cause of the events reported by Lisa any further.
If Hill is right, and Facebook’s use of phone numbers can produce issues like this one, then brace yourself: it’s about to inject itself with a hit of up to one billion more of them from WhatsApp.
The stories published on Fusion are shocking, and whilst it remains a possibility that the people involved were mistaken, or even lying, these are not lone voices.
Stories about Facebook’s ability to dredge up unsuitable or even dangerous matches, whether they’re OkCupid dates we’d rather not see again or protected journalistic sources, have been around for as long as the People You May Know feature itself.
People You May Know isn’t designed to out people or put them at risk, but when your algorithms are secret, ever changing and handling 1.7 billion active users your edge cases can affect a lot of people.
The edge cases are also an insight into how much more than 98 things Facebook really knows about you.
The shock that Facebook may have leaked the names of a psychiatrist’s patients to each other masks the shock that Facebook knows enough to be able to do that in the first place, and that even tech-savvy users like Lisa aren’t aware of just how much they’re sharing.
With a bit of luck, you’ll avoid the worst sort of mistakes that Facebook’s algorithm might make, but even if you do the algorithm’s still there, gathering data, making connections and learning all about you. And you can’t turn it off.
Not unless you leave.