We recently reported that Apple was embroiled in a major controversy after it was revealed that outsourced contractors were listening to audio clips of users interacting with Siri, the company's virtual assistant.
Today, the company commented on its next steps on the subject in a press release.
At Apple, we believe that privacy is a fundamental human right. We design our products to protect users' personal data, and we are constantly working to strengthen those protections. This also applies to our services. Our goal with Siri, the pioneering intelligent assistant, is to provide the best experience for our customers while vigilantly protecting their privacy.
We know that customers have been concerned about recent reports of people listening to Siri audio recordings as part of our Siri quality evaluation process, which we call grading. We heard their concerns, immediately suspended human grading of Siri requests, and began a thorough review of our practices and policies. As a result, we have decided to make some changes to Siri.
Siri and privacy
The company said it protects users' privacy by performing as many tasks as possible on the device itself, without resorting to servers and, consequently, to data collection.
When data does need to be stored, Apple does not use it to build marketing profiles or sell it to third parties. "We use Siri data only to improve Siri, and we are constantly developing technologies to make Siri even more private."
According to Apple, Siri uses as little data as possible to deliver an accurate result. When you ask about a sporting event, for example, Siri uses your general location to provide appropriate results. But if you ask for the nearest bakery, more specific location data is used.
If you ask Siri to read your unread messages, Siri simply instructs your device to read them aloud. The content of those messages is not transmitted to Apple's servers, since that is not necessary to fulfill the request.
Siri uses a random identifier (a long sequence of letters and numbers associated with a single device) to track data while it is being processed, rather than linking it to your identity via your Apple ID or phone number, a process that Apple believes it is unique among digital assistants. For even more protection, after six months, this device data is disassociated from the random identifier.
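Apple has not published implementation details for this, so as a purely illustrative sketch (all names here are hypothetical, not Apple's actual code), pseudonymization of this kind boils down to keying stored requests by a random identifier with no link to an account, then stripping that identifier once the retention window passes:

```python
import secrets
from datetime import datetime, timedelta

# Roughly six months, per Apple's stated retention window.
RETENTION = timedelta(days=182)

class RequestStore:
    """Hypothetical store keyed by a random device identifier."""

    def __init__(self):
        # A long random string of letters and numbers tied to the device,
        # with no link to an Apple ID or phone number.
        self.device_id = secrets.token_hex(16)
        self.records = []  # each record: [timestamp, device_id_or_None, transcript]

    def log_request(self, transcript, now=None):
        now = now or datetime.utcnow()
        self.records.append([now, self.device_id, transcript])

    def disassociate_expired(self, now=None):
        # After the retention window, drop the identifier so the stored
        # data can no longer be traced back to the device.
        now = now or datetime.utcnow()
        for rec in self.records:
            if rec[1] is not None and now - rec[0] > RETENTION:
                rec[1] = None
```

The point of the design is that even Apple's own systems would have no mapping from the identifier back to a person, and after six months not even the device link remains.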
On iOS, you can see the details of the data Siri accesses and how Apple protects this information in the Siri & Search settings, by tapping the "About Ask Siri & Privacy" link.
How your data improves Siri
According to the company, for Siri to complete personalized tasks more accurately, it collects and stores certain information from our devices. For example, when Siri encounters an unusual name, it may use names from our contacts to make sure the person is recognized correctly.
Siri also relies on data from our interactions with it. This includes the audio of the request and a computer-generated transcript. Apple sometimes uses the audio recording of a request, as well as its transcript, in a machine learning process that trains Siri to improve.
Before the company suspended this evaluation, the review process involved only a small sample (less than 0.2%) of audio from requests made to Siri. The review served to check whether Siri was responding as expected and to improve its reliability, for example: did the user intend to activate Siri? Did Siri hear the request accurately? Did it respond appropriately to the request?
What is changing
As a result of the review conducted since the program was suspended, Apple realized it had not been fully living up to its high ideals, and apologized.
The program is expected to resume later this year, when the new operating systems (iOS/iPadOS 13, macOS Catalina 10.15, watchOS 6, and tvOS 13) are released, but with the following changes:
- By default, Apple will no longer keep audio recordings of Siri interactions. It will, however, continue to use computer-generated transcripts to help improve Siri.
- Users may opt in to help Siri improve by letting it learn from audio samples of their requests. Apple, of course, hopes many people will choose to help Siri, precisely because of this policy. Those who opt in can back out at any time.
- For customers who opt in, only Apple's own employees will be allowed to listen to audio samples of Siri interactions. This team will work to delete any recording determined to be an inadvertent trigger of Siri.
The company concluded its statement by saying it is committed to putting the customer at the center of everything it does, including protecting their privacy.
We created Siri to help users get things done faster and more easily, without compromising their right to privacy. We are grateful to our users for their passion for Siri and for pushing us to improve constantly.
For those interested, Apple addresses the issue further in this support article.