
Apple employees can listen to private Siri recordings on iPhone

Apple hires contractors to listen to user recordings in order to train its virtual assistant, Siri. According to a report published by The Guardian on Friday, third-party workers have access to audio snippets captured by the iPhone, Apple Watch and HomePod, often containing private information such as doctor-patient conversations, business negotiations, drug deals and people having sex. Contacted by the newspaper, Apple confirmed that humans have access to a small amount of Siri audio to help detect technical failures.

The company, however, denies that users can be identified through the recordings. A small portion of Siri commands is analyzed to improve the assistant and the dictation feature, and requests are not associated with the user's Apple ID. Siri's responses are reviewed in secure facilities, and all reviewers are obliged to adhere to Apple's strict confidentiality requirements, the company explained to The Guardian.


Apple employees listen to user recordings to train Siri. Photo: Marvin Costa / TechTudo


The information surfaced after one of these workers, whose identity was not disclosed, described the job to The Guardian. According to the source, recorded content is provided to employees along with the handset owner's location, contact details and app usage data.

The source says Apple does not ask for details about what was heard. The only action reviewers can take is to file a report flagging a possible technical failure; that is, the company only asks them to indicate whether or not a given recording shows Siri being activated unintentionally.

Apple has confirmed that these employees' tasks involve pointing out situations where Siri is mistakenly activated. Recordings accessed by humans would, according to the company, only last a few seconds and represent less than 1% of everything the assistant captures.

Apple Watch, the champion in activating Siri by mistake. Photo: Thássius Veloso / TechTudo

However, the employee claims that the job involves listening to several seconds of each recording to determine whether the user summoned the assistant on purpose or not. The champion of accidental activations is the Apple Watch: according to the source heard by the newspaper, the watch records snippets of about 30 seconds, long enough to get a good idea of what is happening.

In its terms of use, Apple says that data captured by Siri can be used to improve the assistant, refine dictation and better recognize the way the user speaks. However, the company does not make clear that other people may hear snippets of recordings in this process.

It is worth remembering that Amazon also recently faced controversy over its virtual assistant, Alexa. Earlier this month, the company confirmed that it keeps transcripts of commands dictated to Alexa even when the user manually deletes the voice recordings. According to the company, this data is used so that the Amazon Echo can recognize when it is being called by its owner without having to contact the servers. It is not clear, however, whether the information is stored only on the device itself.
