
Digital Privacy Advocates Express Concerns About iPhone X's TrueDepth Technology

The iPhone X has not yet reached the hands of (almost) any consumers around the world, but many aspects of its, shall we say, "strong personality" have been the subject of discussion for quite some time.

Among these aspects, one of the champions of controversy is the TrueDepth technology, which equips the device's front camera and enables, through a series of ultra-advanced sensors, 3D recognition of whatever it captures, such as our faces, both for unlocking the device via Face ID and for creating Animoji.

Today, some digital privacy advocates voiced concerns about the potential abuses the technology could allow. The American Civil Liberties Union (ACLU) and the Center for Democracy and Technology (CDT) made statements to Reuters pointing out a kind of "loophole" in Apple's privacy policy regarding the facial detection technology which, according to the organizations, could compromise the privacy of iPhone X users.

Basically, the point is this: Apple guarantees that all Face ID and facial recognition data is stored locally on the device and never sent to its servers or anything of the sort, an unquestionable and seemingly inviolable rule. The problem, according to the organizations, is that Apple allows third-party developers to use the TrueDepth technology in their own applications and does not restrict what is done with the content captured by such software. That is, in theory, developers could detect and store on their own servers users' faces, their facial expressions, and attention-related information.


Apple's privacy policy makes it clear that developers have access only to a mathematical representation of facial recognition points, not a visual map of users' faces; in other words, the data they obtain would not be enough to, say, unlock the phone. In addition, such data cannot be sold by developers under any circumstances, and users must explicitly grant applications access to the front camera before detection can begin to work.
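To illustrate the explicit-consent step mentioned above (a minimal sketch, not Apple's full requirements): on iOS, any app that wants camera access, including access to the TrueDepth camera, must declare a usage description in its Info.plist, and the system shows the user a permission prompt before the app can capture anything.

```xml
<!-- Info.plist fragment (illustrative): the NSCameraUsageDescription key
     is required for camera access; the string below is shown to the user
     in the consent prompt the first time the app requests the camera. -->
<key>NSCameraUsageDescription</key>
<string>This app uses the front camera for face-based effects.</string>
```

Without this key, the system denies the app camera access entirely; with it, access still only begins after the user taps "OK" on the prompt.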

Still, the groups express concern about what developers can do once facial data leaves the device and reaches their servers; of particular concern is Apple's ability (or inability) to detect and curb applications that violate its policies.

Personally, the hairs on the back of my neck stand up when I think about the prospect of an application that forces you to look at an advertisement until it's over, pausing it if you look away. Given that Apple's policy allows developers to detect users' eye movements and facial expressions, this would, in theory, be very easy to build. And absolutely scary, dystopian, and wrong.

The details of the situation, of course, will only become clear to the general public once a large number of users have the iPhone X in hand and researchers can analyze its behavior in depth. For now, there is only the warning and the framing of what is, at bottom, the old and important question that has accompanied us since the dawn of the internet: how much of your privacy are you willing to give up for a more advanced service?

via The Loop