Naked Security

Secret Facebook groups being used by pedophiles to swap obscene images

They contain sexually explicit comments, obscene photos, and photos apparently stolen from newspapers, blogs, Facebook pages, school websites, and even clothing catalogs.

Pedophiles – including one who’s been convicted and is already on the sex offenders’ register – are using secret groups on Facebook to post and swap obscene or suggestive images of children, according to the BBC.

Any user can create a secret Facebook group.

Unlike closed groups, a secret group is practically invisible: it doesn’t show up in searches.

A secret group’s name, members, posts, and description are hidden away, and the only people who can see any of that content are those invited and added to the group.

A BBC investigation found a number of secret groups, created by and run for men who are sexually interested in children, including one being administered by a convicted sex offender.

By setting up a fake profile, the BBC managed to get invited into some of the groups.

It found that the secret groups contain sexually explicit comments, obscene photos, and photos that were apparently stolen from a range of sources: newspapers, blogs, Facebook pages, school websites, and even clothing catalogs.

Other images looked like they’d been taken secretly, up close, in public places. The BBC found one user had posted a video of a children’s dance show.

Most of the images were accompanied by obscene comments.

The BBC found groups hosting sexual images of children that have names clearly indicating their content: one that contained obscene material was titled “we love schoolgirlz”.

Another was named “cute teen schoolies” and contained a picture of a girl in a vest, aged 10 or 11, accompanied by the words “yum yum”.

Facebook told the BBC that the image didn’t breach its community standards.

In fact, regardless of the lewd comments accompanying the images, most of the photos uncovered by the BBC didn’t breach Facebook’s community standards.

Those standards define sexual exploitation as including:

…solicitation of sexual material, any sexual content involving minors, threats to share intimate images, and offers of sexual services.

The image of the girl in a vest didn’t breach Facebook’s standards, so it stayed up.

In total, the BBC found 20 obscene images and reported them to Facebook using its report abuse button.

Some users took the images down themselves. Facebook removed four.

Half are still up.

The BBC’s Angus Crawford writes that the publication was so concerned about some content that it handed it over to police and alerted both the Internet Watch Foundation and the National Crime Agency to its findings.

The publication talked with one woman who discovered that innocent pictures of her daughter had been stolen from her own blog site, then posted on a site for pedophiles to swap.

She said she was appalled:

I was horrified. Thinking that these innocent snapshots of my (then) 11-year-old daughter had become the subject of vile comments and disgusting exchanges between members of these groups was really upsetting.

But equally upsetting is the fact that Facebook allows these secret groups to exist, unmonitored and unchecked, making them rife for abuse by pedophiles.

There must be a duty of care to users to make sure that pedophiles can’t hide on these secret groups, stealing and sharing images of children they find online.

Children’s Commissioner for England Anne Longfield had this to say to the BBC about Facebook’s decision to leave up some of the material and groups:

I’m shocked those don’t breach community standards. Any parent or indeed child looking at those would know that they were not acceptable.

She also said that Facebook isn’t doing enough to protect children:

I don’t think at the moment, given what we know about the vulnerability of so many children to predators, that they are doing enough.

Andy Baker, former deputy chief executive of the Child Exploitation and Online Protection Centre, agreed with her assessment, most particularly with regards to secret groups:

I’m not on Facebook, and one of the reasons I’m not is because of what they don’t do.

[Secret groups open up] a complete network of opportunity to paedophiles. That’s why these secret groups should not exist.

The BBC caught up with Rishi Saha, Facebook’s head of public policy, at an event to mark Safer Internet Day last Tuesday, 9 February.

This is what he said:

When it comes to specific groups I think it’s really important that we investigate the groups, so if you’re able to share the details of these groups with me then I can work with my colleagues who do the investigations on these and make sure we’re investigating them and we’re removing the content that shouldn’t be there.

[Facebook will] deal directly with law enforcement to make sure they’re aware of these groups and follow the proper process.

I think it’s really important that we do that and I can give you that commitment that we’re going to do that.

I reached out to Facebook to get a bit more clarification of why it hasn’t taken down some of the images or groups, and to find out if the company is still committed to secret groups in light of how pedophiles are using them.

Facebook sent back an official statement in which the company reiterated what it said to the BBC, adding that it has “zero tolerance for child exploitation.”

This illegal behavior is rare on Facebook but it is immediately removed and reported to relevant law enforcement agencies when it is detected.

We believe that our comprehensive reporting system and community standards help make Facebook one of the safest places for people on the web. We urge people on Facebook to use the reporting tools available on every page so we can investigate and take swift action.

Facebook has a team dedicated to investigating child exploitation, and it works with agencies such as the UK’s Internet Watch Foundation to help build cases against pedophiles and other suspected criminals.

One of the most powerful tools it uses is PhotoDNA: a technology from Microsoft that computes hash values of all images on Facebook – including those in closed groups – in order to identify and flag known child abuse imagery.

In August, Facebook, along with Google and Twitter, hooked up with the Internet Watch Foundation in a hash-sharing agreement that the companies hoped would help in removing potentially millions of images.

Once Facebook gets a match on an image, it deletes the content and refers the details to the National Center for Missing & Exploited Children (NCMEC) so that the agency can contact local law enforcement agencies across the globe.
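The hash-lookup workflow described above can be sketched in a few lines. Note this is a deliberately simplified illustration: PhotoDNA computes a *perceptual* hash that survives resizing and re-encoding, whereas the cryptographic SHA-256 used here only matches byte-identical files. The function names and the placeholder entry in the hash list are hypothetical.

```python
import hashlib

# Placeholder for a shared industry hash list (e.g. one distributed via the
# IWF's hash-sharing scheme). The value below is illustrative only: it is
# simply the SHA-256 digest of the bytes b"test".
KNOWN_ABUSE_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def hash_image(image_bytes: bytes) -> str:
    """Return a hex digest used as the image's fingerprint."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_abuse_image(image_bytes: bytes) -> bool:
    """Flag an upload if its fingerprint appears on the shared hash list."""
    return hash_image(image_bytes) in KNOWN_ABUSE_HASHES
```

In the real pipeline, a match would trigger deletion of the content and a referral to NCMEC rather than a simple boolean; the point of the perceptual hash is that trivially altered copies of a known image still match.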

It’s not surprising that the hashes of many of the images found by the BBC don’t show up as known child abuse images, given that, going by the BBC’s description, they’re not sexually explicit.

The obscene comments that accompany the pictures are what transform something like a catalog photo or an innocent picture stolen from Facebook into a sexualized image for pedophiles.

Unfortunately for the horrified parents who stumble across such comments associated with their children’s images, those remarks constitute free speech, and as such wouldn’t breach Facebook’s community standards.

Even if Facebook were to decide to give up on secret groups, people would still be able to steal images of children and use them however they like on other sites.

One example is #BabyRP, an Instagram role-playing game involving other people’s kids.

Players photo-nap children, mostly harvesting the images from other people’s accounts, and list them as available for open adoption.

Others have found images stolen from their Facebook page that they thought were only visible to friends.

A woman from Utah last May found her 8-year-old daughter’s photo being used in a fake Instagram profile.

After digging around, she found more photos of herself and her 9-month-old son – all stolen from Facebook.

Cases such as these are what recently prompted German police to beseech people to stop posting pictures of their kids on social media.

As it is, a recent study found that the average parent is like a loving but voracious paparazzo, uploading an eyeball-popping 973 photos of their child on social media by the time he or she reaches the age of 5 – with 17% of them ignoring privacy settings altogether as they overshare.

Here are some tips on how to keep kids safe, and how to keep their images out of the hands of pedophiles:

  1. Think carefully about what you post. That picture of your toddler in the buff with nothing but cowboy boots on may be adorable, but if it’s not something you’d be OK showing a stranger, it’s best to keep it off the internet altogether. The same goes for photos that don’t show naked children: if you don’t want your child’s privacy invaded, bear in mind that even innocent photos are liable to be stolen.
  2. Keep ALL your social media accounts private. That includes Facebook, Instagram or anywhere else you post photos or personal information. Also check your settings to see what people who search for you can find, even if your account is private.
  3. Think carefully about whom you accept as friends or followers. Even if someone seems harmless, you have absolutely no idea who they actually are. It’s better to be safe than sorry. If you use social media to keep in touch with family and friends, consider creating a private, or even a secret, Facebook group to share photos. Instagram also allows you to send photos to select people.
  4. If you share more widely, watermark your photos. Watermarking your photos discourages people from stealing them in the first place, but also helps someone to alert you if they’re stolen. Just be careful about putting your full name on them. Consider using your Instagram or Twitter handle instead.
  5. Turn off location tagging. #BabyRP is only one example of what strangers can do with location tagging. Another was the cat map from Owen Mundy, a data analyst who took advantage of the millions of online cat images to show how geolocation can help predators (or cat stalkers) track down exactly where your cat (or your kid) lives.
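Tip 5 is worth a concrete illustration. Photos typically carry GPS coordinates in their EXIF metadata, and tools such as exiftool or the piexif library can remove them before you share a picture. The sketch below is a conceptual stand-in: it operates on a plain dictionary representing a photo’s EXIF tags (the tag names are illustrative), not on a real image file.

```python
# EXIF tags that reveal where (and when, at that location) a photo was taken.
# Tag names here are illustrative; real EXIF uses a dedicated GPS IFD.
LOCATION_TAGS = {"GPSLatitude", "GPSLongitude", "GPSAltitude", "GPSTimeStamp"}

def strip_location(exif: dict) -> dict:
    """Return a copy of the EXIF metadata with all GPS tags removed."""
    return {tag: value for tag, value in exif.items()
            if tag not in LOCATION_TAGS}
```

For example, `strip_location({"Make": "Apple", "GPSLatitude": 40.7, "GPSLongitude": -74.0})` keeps the camera make but drops both coordinates, so a shared photo no longer pinpoints your home or your child’s school.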

Image of Man on laptop courtesy of Shutterstock.com

3 Comments

But I thought this is the kind of thing that only happens on Tor. And that’s why Tor needs to be taken down, right? Maybe Facebook should go next. Then on to Google Plus. Because I bet there is far more of this kind of thing going on right out in the open than there is on Tor.

Why is this even a news story? There are paedophiles on the internet? Yeah, that’s been happening for decades. Facebook has no interest in its users’ privacy/safety? Nothing new there either ;^/
