There’s been a lot of hype recently over Meitu, a photo app that has been around in China since 2008. It seems the rest of the world is just now discovering it, and souped-up selfies are popping up across Facebook and other social media sites.
Even Naked Security editor Kate Bevan has been playing with it – for research purposes, of course – and she has since uninstalled it.
As the app goes viral, some security researchers are raising red flags about the amount of data it collects on its users.
Meitu – which means “beautiful picture” – offers one-touch enhancement of your looks and promises to give your selfies a “hand-drawn look”, using facial recognition and augmented reality. It now boasts an estimated 456m monthly active users across the globe.
Security researcher Jonathan Zdziarski is among those who are warning of the privacy risks:
To be fair, and as Zdziarski is quick to point out, this is hardly the first app to do this. But it’s a risk all the same, and he warned journalists to focus on the big picture:
The Register reported that Meitu harvests information about the devices it runs on, both Android and iPhone. It includes “invasive advertising tracking features and is just badly coded. But worst of all, the free app appears to be phoning home to share personal data with its makers,” the publication said.
The problem for security researchers and privacy advocates isn’t that it collects data. Some data collection is necessary for an app to be properly configured on a device. It’s obvious and understandable that Meitu needs to access your phone’s camera, for instance. The concern is about the other data collected, which includes your GPS location, SIM card and Wi-Fi connection details, and cell carrier information.
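For readers who want to check this kind of thing themselves: on Android, everything an app is allowed to reach is declared as permissions in its manifest, which tools such as apktool can extract from the installed APK. Here’s a minimal sketch in Python that lists those declarations – the XML excerpt is a hypothetical example, not Meitu’s actual manifest, but the permission names shown correspond to the categories of data discussed above (camera, location, phone/SIM state, Wi-Fi details):

```python
# Minimal sketch: list the permissions an Android app declares in its
# AndroidManifest.xml. The manifest excerpt below is hypothetical,
# for illustration only.
import xml.etree.ElementTree as ET

# Namespace Android uses for manifest attributes such as android:name
ANDROID_NS = "http://schemas.android.com/apk/res/android"

manifest_xml = """<manifest
    xmlns:android="http://schemas.android.com/apk/res/android">
  <uses-permission android:name="android.permission.CAMERA"/>
  <uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
  <uses-permission android:name="android.permission.READ_PHONE_STATE"/>
  <uses-permission android:name="android.permission.ACCESS_WIFI_STATE"/>
</manifest>"""

def declared_permissions(xml_text):
    """Return the android:name of every <uses-permission> element."""
    root = ET.fromstring(xml_text)
    return [el.attrib[f"{{{ANDROID_NS}}}name"]
            for el in root.iter("uses-permission")]

for perm in declared_permissions(manifest_xml):
    print(perm)
```

A long permissions list isn’t proof of wrongdoing on its own – plenty of legitimate apps request location or phone state – but it tells you what the app *can* collect, which is exactly the question researchers are asking of Meitu.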
So, should people stop using it? Not necessarily. But users should be aware of what the app is collecting, and make their choices accordingly.
A good starting point is to read the end user license agreement in full when you download the app. That way, you at least know what you’re getting into. Here’s some of it: