Privacy Groups Concerned That iPhone X App Devs Have Access To Face ID Data

Technology is great, wonderful, groovy, and every other kind of positive adjective, except when it isn't. There are always downsides to be found, and the iPhone X with its fancy facial recognition technology—Face ID, as Apple calls it—is no exception. The problem raised by privacy advocates is that while the facial data used to unlock the iPhone X is securely stored on the handset, those protections do not extend to the separate facial data that apps can collect once a developer has the user's permission.

App developers can tap into the iPhone X's facial data to build various experiences for users, such as having a game character mimic a person's facial expressions in real time, and other cutesy things of that nature. Apple's policy is to let developers access the necessary facial data as long as they are granted permission by the customer and agree not to sell the data to third parties, among other restrictions.
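On the iPhone X, this sort of access happens through Apple's ARKit face-tracking API. The Swift sketch below is a minimal, hypothetical illustration of the flow described above: the app must ask the customer for camera permission before any face data flows, and what it then receives are expression coefficients suitable for animating a character. The class name and print statements are illustrative assumptions, not code from any shipping app.

```swift
import ARKit
import AVFoundation

// Minimal sketch of permission-gated face tracking on iPhone X.
// The user-facing purpose string comes from the NSCameraUsageDescription
// key in the app's Info.plist; without it, iOS will not grant access.
final class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires the TrueDepth camera (iPhone X and later).
        guard ARFaceTrackingConfiguration.isSupported else { return }

        // The customer must explicitly grant camera access first.
        AVCaptureDevice.requestAccess(for: .video) { granted in
            guard granted else { return } // declined: no facial data at all
            DispatchQueue.main.async {
                self.session.delegate = self
                self.session.run(ARFaceTrackingConfiguration())
            }
        }
    }

    // ARKit delivers updated face anchors as the user's expression changes.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // blendShapes maps named expressions (jaw open, eye blink, and
            // so on) to coefficients between 0 and 1, which is what lets a
            // game character mimic the player's face in real time.
            if let jawOpen = face.blendShapes[.jawOpen] {
                print("jawOpen: \(jawOpen.floatValue)")
            }
        }
    }
}
```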

That is a good first step, but what customers need to understand is that once app developers are in possession of their facial data, it is no longer secured on their device. Developers are allowed to move the facial data off the iPhone X and store it on their own servers, and there are a couple of problems with that. First, who can be sure a particular developer's servers are secure from hacking? Second, remote storage of sensitive data limits how effectively Apple can enforce its privacy policy.

"The privacy issues around of the use of very sophisticated facial recognition technology for unlocking the phone have been overblown," said Jay Stanley, a senior policy analyst with the American Civil Liberties Union. "The real privacy issues have to do with the access by third-party developers."

On the bright side, the data that is available to developers cannot be used to unlock a handset, as that process requires a mathematical translation of a person's face and not a visual map of their mug, according to Apple. The Cupertino outfit also points out that developers risk being kicked out of its App Store if they abuse a customer's data, a potentially costly proposition for the developer. Apple also reviews apps before they are approved, and routinely audits them.
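To make that distinction concrete, here is a short, hypothetical Swift sketch of the kind of data ARKit actually hands an app: a coarse 3D mesh and per-expression coefficients. This is a visual map of the face, useful for animation; per Apple's statements, it is separate from the mathematical representation Face ID keeps on the device for unlocking.

```swift
import ARKit

// Sketch of what a developer can read from a face anchor. The values are
// geometry and expression data, not the Face ID template used to unlock
// the phone.
func inspect(_ face: ARFaceAnchor) {
    // A low-resolution triangle mesh of the face (on the order of a
    // thousand vertices), updated as the expression changes.
    let mesh: ARFaceGeometry = face.geometry
    print("mesh vertices: \(mesh.vertexCount)")

    // Expression coefficients in the range 0 to 1.
    if let blink = face.blendShapes[.eyeBlinkLeft] {
        print("eyeBlinkLeft: \(blink.floatValue)")
    }
}
```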

The intent of giving developers access to facial data is to enable clever user experiences, not anything sinister. Apple's policies prohibit developers from selling the data for advertising or marketing and forbid creating user profiles that could be used to identify otherwise anonymous users, according to the company's developer agreement. So it's not as though Apple is being careless here; it just boils down to whether all developers will abide by the rules or choose to abuse them.