Apple contractors listen to private Siri conversations, one of them says

Apple presents itself as the chief guardian of user privacy, but in practice not even the mighty Apple escapes the lack of privacy that is (almost) inherent to digital life. A new report by The Guardian illustrates this well.

The British newspaper interviewed an employee of a company hired by Apple to review excerpts of audio processed by Siri: small voice clips recorded by users' assistants and sent for analysis to improve the speech recognition system. According to the employee's testimony, the content of these recordings is far from harmless or disposable.

According to him, contract workers often listen to highly private and sensitive recordings, including audio of people having sex, discussing treatments with doctors, closing confidential business deals, or committing criminal acts such as drug deals. The frequency with which these audios are "accidentally" captured by Siri is high, according to the whistleblower.

It is worth noting that, as part of Apple's privacy strategy, the recordings are not accompanied by users' personal data; in addition, Apple makes clear in Siri's terms of use that some recordings captured by the assistant may be reviewed by humans.

The problem, the employee warns, is that the anonymity of these recordings is not so anonymous: in some cases, workers analyze excerpts of up to 30 seconds of audio, in which it is easy to hear personal information and build a profile of the user. Moreover, he says, Apple does not make it sufficiently explicit to its customers that Siri can trigger false positives, that is, "wake up" and start recording at a random moment, without being summoned, simply because it captured audio that sounds similar to its activation command.

In response to The Guardian, Apple reiterated that Siri's clips are always analyzed anonymously, and that a random sample of less than 1% of the assistant's daily activations is used for review, with most voice clips lasting only a few seconds. "All reviewers are under the obligation to adhere to Apple's strict confidentiality requirements," said the company.

For the employee interviewed by the newspaper, however, that answer is not enough: reviewers with bad intentions could very well use the information obtained from the clips against the company's users. The chances of that happening are admittedly small, but the reminder stands: absolute privacy, in the digital age, is a utopia, even with Apple.

via AppleInsider