
Digital privacy advocacy groups express concerns about iPhone X’s TrueDepth technology

The iPhone X has not yet reached the hands of (almost) any consumer around the world, but several aspects of its, shall we say, "strong personality" have been the subject of much talk for quite some time.

Among these aspects, one of the biggest magnets for controversy is the TrueDepth technology, which equips the device's front camera and enables, through a set of advanced sensors, 3D recognition of whatever it captures – like our faces, both for unlocking the device via Face ID and for Animoji-style flourishes.

Today, some digital privacy advocacy groups expressed concerns about the potential intrusions the technology may allow. The American Civil Liberties Union (ACLU) and the Center for Democracy and Technology (CDT) gave statements to Reuters exposing a kind of "loophole" in Apple's privacy policy related to the facial detection technology – a loophole that, according to the organizations, could compromise the privacy of iPhone X users.

Basically, the point is this: Apple guarantees that all Face ID and facial recognition data is stored locally on the device and never sent to its servers – an unquestionable and apparently inviolable rule. The problem, according to the groups, lies in the fact that Apple allows third-party developers to use the TrueDepth technology in their own applications and does not restrict what is done with the content those apps capture. In theory, developers could detect and store users' faces, facial expressions, and attention-related information on their own servers.
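For context, the channel developers use here is Apple's ARKit face-tracking API, which shipped alongside the iPhone X. Below is a minimal sketch – our illustration, not code from the article – of how an app taps the TrueDepth camera and reads per-frame expression data:

```swift
import UIKit
import ARKit

// Minimal sketch: a view controller that starts TrueDepth face tracking
// and reads per-frame expression coefficients via ARKit.
class FaceTrackingViewController: UIViewController, ARSessionDelegate {
    let session = ARSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Face tracking is only supported on TrueDepth hardware (iPhone X).
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // Called continuously while the sensor tracks the user's face.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // blendShapes maps ~50 expression keys to coefficients in 0.0...1.0
            // (smiles, blinks, brow movement, eye direction, and so on) –
            // exactly the kind of data the privacy groups worry could leave the device.
            let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            print("Smile coefficient: \(smile)")
        }
    }
}
```

Nothing in that delegate callback prevents an app from forwarding those coefficients to its own server – which is precisely the gap the ACLU and CDT are pointing at.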


Apple's privacy policy makes it clear that developers have access only to a mathematical representation of facial recognition points, not to a visual map of users' faces – that is, the data they obtain would not be enough to, say, unlock the phone. Furthermore, this data cannot be sold by developers under any circumstances, and users must explicitly allow the applications in question to access the front camera before detection starts working.
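To make the distinction concrete, here is a hedged sketch (the helper's name is ours) of what that "mathematical representation" looks like in ARKit terms – a triangle mesh of raw 3D points, not a photo or a depth image:

```swift
import ARKit

// Sketch: what a developer actually receives is a triangle mesh of plain
// 3D points – numbers, not an image, and nothing that can be fed back
// into Face ID to unlock a phone.
func describeFaceData(_ faceAnchor: ARFaceAnchor) {
    let geometry: ARFaceGeometry = faceAnchor.geometry
    print("Mesh points: \(geometry.vertexCount)")  // on the order of a thousand vertices
    let v = geometry.vertices[0]                   // each vertex is a plain float triple
    print("First vertex: (\(v.x), \(v.y), \(v.z))")
}
```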

Still, the groups express concern about what developers can do once facial data leaves the device and reaches their servers; among the points of concern, a particularly sensitive one is Apple's ability (or inability) to detect and deter applications that violate the company's policies.

Personally, the hair on the back of my neck stands on end just thinking about the prospect of an app that requires you to look at an advertisement until it ends and pauses it if you look away. Given that Apple's policy allows developers to detect users' eye movements and facial expressions, this would, in theory, be very easy to build. It is absolutely scary, dystopian, and wrong.
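To show how low the barrier really is, here is a deliberately dystopian, purely hypothetical sketch (the class name and ad URL are made up) of gaze-gating a video with the same blend-shape coefficients:

```swift
import ARKit
import AVFoundation

// Hypothetical sketch: pause an ad whenever the blend-shape coefficients
// suggest the viewer's eyes have turned away from the screen.
class GazeGatedAdPlayer: NSObject, ARSessionDelegate {
    // Placeholder URL – no such ad exists.
    let player = AVPlayer(url: URL(string: "https://example.com/ad.mp4")!)

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // eyeLookOut* rises toward 1.0 as each eye turns away from center.
            let left  = face.blendShapes[.eyeLookOutLeft]?.floatValue ?? 0
            let right = face.blendShapes[.eyeLookOutRight]?.floatValue ?? 0
            // isTracked goes false when the face leaves the camera's view.
            let lookingAway = !face.isTracked || max(left, right) > 0.6
            if lookingAway {
                player.pause()
            } else {
                player.play()
            }
        }
    }
}
```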

The details of the situation, of course, will only become clear to the general public once a large number of users have the iPhone X in hand and researchers can analyze apps' behavior in depth. For now, there is only the alert and an invitation to a debate – which, in the end, is the same old important debate the internet has always raised: how much of your privacy are you willing to give up in exchange for a more advanced service?

via The Loop