A Russian AI-powered selfie-editing app that recently went viral has become the center of controversy after being called out over privacy concerns, TechCrunch reports.
FaceApp uses neural networks and artificial intelligence to add a few years and create a pretty compelling version of the user’s older, grayer self. The app went viral globally again this week, partly due to the FaceApp challenge that has celebrities sharing their future selfies along with the rest of us.
On Wednesday, a Twitter user posted that the app was uploading all of a user’s photos to its servers. He later deleted those tweets after realizing they weren’t accurate, though he gave permission for screenshots of them to be used. FaceApp was quick to clarify the conditions under which it handles users’ photos.
“FaceApp performs most of the photo processing in the cloud. We only upload a photo selected by a user for editing. We never transfer any other images from the phone to the cloud,” the Russian company said.
“We might store an uploaded photo in the cloud. The main reason for that is performance and traffic: we want to make sure that the user doesn’t upload the photo repeatedly for every edit operation. Most images are deleted from our servers within 48 hours from the upload date,” it said, adding that it doesn’t sell or share “any user data with any third parties.”
Another issue raised by FaceApp users was that the iOS app appeared to override privacy settings: people reported that even after denying the app access to their camera roll, they could still select and upload a photo, despite the app not having permission to access their photo library.
But that is actually allowed behavior in iOS, which gives users the power to block an app from full camera roll access while still choosing individual photos to share with it if they so wish, TechCrunch writes.