Fitbit? Pollination? Jaguars? Snakes? Mason jars?
OK, fine, Facebook, I’m not surprised that I’ve clicked on those things. But when did I ever click on anything related to Star Trek: Voyager? Or Cattle?!
My “this feels weird” reaction makes me one of the 51% of Facebook users who report that they’re not comfortable with the ad-driven company compiling a list that assigns each of us categories based on our real-life interests.
It’s called “Your ad preferences.” You can view yours here. If you drill down, you can see where Facebook gets its categorization ideas, including the things we click on or like, our relationship status, who employs us, and far more.
Most people don’t even know that Facebook keeps a list of our traits and interests. In a new survey from Pew Research Center that set out to gauge how well people understand Facebook’s algorithm-driven classification systems and how they feel about its collection of personal data, the majority of participants said they had no idea the list existed until they took part in the survey.
Overall… 74% of Facebook users say they did not know that this list of their traits and interests existed until they were directed to their page as part of this study.
Once the participants were directed to the ad preferences page, most – 88% – found that the platform had generated material about them. More than half – 59% – said that the categories reflected their real-life interests. But 27% said that the categories were “not very” or “not at all” accurate in describing them.
And then, after finding out how the platform classifies their interests, about half of Facebook users – 51% – said they weren’t comfortable with Facebook creating such a list.
The Pew Research Center’s conclusions come out of a survey of 963 US adults, ages 18 and older, who have a Facebook account. It was conducted from 4 September to 1 October 2018. You can see the full methodology here.
What inputs does Facebook’s algorithm chew over?
The “Your ad preferences” page, which is different for every user, reflects only part of what Facebook uses to slice users’ lives into categories that advertisers can market to. Unless you’ve drilled down into that page’s categories and told Facebook to forget certain things you’ve posted, liked, commented on or shared, the algorithm takes all of that activity into account.
But as we well know, Facebook also follows us around off the site. One of the outcomes of the two days of Congressional grilling that Facebook CEO Mark Zuckerberg went through in April 2018 was that Facebook coughed up details on how it tracks both users and non-users when they’re not even on the site.
In explaining why Facebook collects non-users’ data, David Baser, Product Management Director, said that one tool, Audience Network, lets advertisers create ads on Facebook that show up elsewhere in cyberspace. In addition, advertisers can target people with a tiny but powerful snippet of code known as the Facebook Pixel: a web targeting system embedded on many third-party sites. Facebook has lauded it as a clever way to serve targeted ads to people, including non-members.
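What does that snippet actually look like under the hood? Here’s a minimal sketch, in TypeScript, of how this kind of pixel works. It’s illustrative only: the snippet Facebook hands to site owners is a minified JavaScript loader, and the pixel ID below is made up, but the moving parts are the same.

```typescript
// Simplified sketch of a third-party tracking pixel (illustrative only).

// The real loader queues tracking calls in a global `fbq` stub so that
// nothing is lost while Facebook's script loads asynchronously.
type Fbq = { (...args: unknown[]): void; queue: unknown[][] };
const w = window as unknown as { fbq?: Fbq };

if (!w.fbq) {
  const stub = ((...args: unknown[]) => { stub.queue.push(args); }) as Fbq;
  stub.queue = [];
  w.fbq = stub;
}

// Inject Facebook's event script. Because the request goes to
// facebook.net, it carries any Facebook cookies the visitor already
// has, which is what ties an off-site visit back to a profile.
const s = document.createElement('script');
s.async = true;
s.src = 'https://connect.facebook.net/en_US/fbevents.js';
document.head.appendChild(s);

// Register this site's pixel and report the visit.
w.fbq('init', '123456789012345'); // hypothetical pixel ID
w.fbq('track', 'PageView');
```

Every page that embeds a snippet like this, on any site, becomes another data point for the categorization machinery described above, whether or not the visitor has a Facebook account.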
Beyond those tools, Pew Research Center said that Facebook also has a tool that enables advertisers to track users who’ve “converted”: in other words, users who saw or clicked on a Facebook ad and then went off and bought whatever it was advertising. Bear in mind that you can opt out of that on your ad preferences page.
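Mechanically, that conversion tracking can be the same pixel doing one more job. One common pattern, sketched below with made-up numbers, is to fire the pixel’s standard Purchase event from the advertiser’s order-confirmation page:

```typescript
// On the advertiser's order-confirmation page, the same pixel reports
// the sale back to Facebook. `Purchase` is one of the pixel's standard
// events; the value and currency here are hypothetical.
declare const fbq: (...args: unknown[]) => void;

fbq('track', 'Purchase', { value: 34.99, currency: 'USD' });
```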
Out of the ocean of data that comes from all those sources, Facebook knows us by our demographics, our social networks and personal relationships, our political leanings, what’s happening in our lives, what foods we prefer, our hobbies, what movies we watch, what musicians we shell out money to hear, and what flavor of digital device we use. That’s a lot of grassland for advertisers to graze on.
It’s no surprise, then, that 88% of Facebook users in the study said they were assigned categories, while 11% found, after being directed to their ad preferences page, that they don’t exist in slice-and-dice advertising terms: they were told that they “have no behaviors.”
Political and racial buckets
The study asked targeted questions about two touchy, highly personal subjects in Facebook’s bucket lists: political leanings and racial/ethnic “affinities.”
The study found that Facebook assigns a political leaning to about half of Facebook users – 51%. Of that group, 73% said that Facebook got it very or somewhat right, but 27% reported that the label describes them “not very” or “not at all” accurately. In other words, 37% of all Facebook users are assigned a political label that they say describes them well, while 14% get put into a box that they feel doesn’t fit.
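If those last two figures look like they came out of nowhere, they’re just the accuracy splits applied to the 51% of users who get a political label at all. Spelled out:

```typescript
// Deriving the article's 37% and 14% from Pew's reported shares.
const assignedLabel = 0.51; // share of users given a political leaning at all
const sayAccurate = 0.73;   // of those, share who say the label fits
const sayOff = 0.27;        // of those, share who say it doesn't

console.log((assignedLabel * sayAccurate).toFixed(2)); // "0.37" → 37% of all users
console.log((assignedLabel * sayOff).toFixed(2));      // "0.14" → 14% of all users
```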
Pew Research Center found that people who describe themselves as moderates are more likely than others to say that Facebook didn’t classify them accurately. Some 20% of self-described liberals and 25% of self-described conservatives disagree with the labels Facebook assigns them, but 36% of people who call themselves moderates say that Facebook miscategorized them.
When it comes to race and ethnicity, Pew Research Center reminds us that this is all about selling ads, not about recording what race or ethnicity we actually are. Hence Facebook assigns a “multicultural affinity”: an apparent interest in content associated with a group, which can be used for targeted marketing.
Do you have an affinity for African American or Hispanic music, for example? Do you like and share rap videos or Latino love songs? If so, you could be a tasty target for whoever’s selling to those markets. Then again, you could be the group that they’d just as soon snub.
As Pew Research Center points out, this particular category has been controversial, as advertisers have chosen to exclude marketing to certain groups. From its article:
Following pressure from Congress and investigations by ProPublica, Facebook signed an agreement in July 2018 with the Washington State Attorney General saying it would no longer let advertisers unlawfully exclude users by race, religion, sexual orientation and other protected classes.
The Pew study found that, of the users Facebook assigned this type of affinity, it labelled 43% as being interested in African American culture, 43% as having an affinity with Hispanic culture, and 10% as having an affinity with Asian American culture. Those are the only racial/ethnic affinities Facebook tracks in the US.
How accurate are the platform’s algorithms at guessing where our racial/ethnic affinities lie? Sixty percent of study participants said that Facebook got it “somewhat” or “very” right, while 37% said the company got it wrong: they don’t have a strong affinity for the group that Facebook suggested. And while 57% of those assigned a group reported that they do consider themselves to be a member of that group, another 39% said nope, that’s not me.